This study explores the design and implementation of a search algorithm for the DustySWARM team in the NASA Swarmathon 2017 physical competition. The competition challenges autonomous robotic systems, known as "swarmies," to collaboratively locate and collect resources in a simulated environment. To address this challenge, three search strategies were developed and evaluated: the square spiral path, the spiral path, and the Epicycloidal wave path. Each method aimed to optimize resource collection efficiency while maintaining communication and coordination among the swarmies. Experimental results revealed that the Epicycloidal wave path was the most effective, consistently outperforming other strategies by collecting the highest number of resources within the competition’s time constraints. This paper outlines the algorithm development process, detailing the design considerations, coding techniques, and testing procedures that contributed to the success of the Epicycloidal wave approach. The findings underscore the importance of strategic path planning and robust coordination in enhancing the performance of autonomous robotic swarms in resource collection tasks.
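The abstract does not give the team's actual path equations or parameters, but an epicycloidal path has a standard parametrization: a point on a circle of radius r rolling around a fixed circle of radius R traces x(t) = (R+r)cos t − r·cos(((R+r)/r)·t), y(t) = (R+r)sin t − r·sin(((R+r)/r)·t). A minimal sketch of waypoint generation along such a curve (all parameter values here are illustrative, not the competition settings):

```python
import math

def epicycloid_waypoints(R=2.0, r=0.5, steps=200, laps=3):
    """Generate (x, y) waypoints along an epicycloid.

    R: radius of the fixed circle, r: radius of the rolling circle.
    Illustrative defaults only -- not the DustySWARM parameters.
    """
    pts = []
    k = (R + r) / r  # frequency ratio of the rolling motion
    for i in range(steps):
        t = laps * 2 * math.pi * i / steps
        x = (R + r) * math.cos(t) - r * math.cos(k * t)
        y = (R + r) * math.sin(t) - r * math.sin(k * t)
        pts.append((x, y))
    return pts
```

A rover following these waypoints sweeps a petal-like "wave" pattern whose lobes repeatedly pass near the center, which is one plausible reason such a path suits a collect-and-return task; the team's actual design rationale is described in the paper itself.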
This project focuses on the design and implementation of an optimal search algorithm for a team of autonomous rovers, referred to as "Swarmies," developed by NASA for the 2018 NASA Swarmathon Physical Competition. Swarmies are compact, cooperative robots that mimic the foraging behavior of ants while searching for simulated Mars resources. The objective of this study is to enhance the effectiveness of these autonomous rovers in locating resources by developing and refining search algorithms. The team, TAMIU DustySWARM3.0, evaluated several search algorithms, including the Epicycloidal Spiral Wave, Fibonacci, and Snake Path, developed in previous iterations of the DustySWARM project. Through extensive simulations and real-world trials, a square-spiral search path was identified as the most efficient for resource collection in the competition. This paper provides a comprehensive overview of the system engineering process, algorithm design, and code development involved in implementing the square-spiral path, with a focus on computer science methodologies. The study demonstrates how the integration of optimal algorithms, testing, and systems design can advance the capabilities of autonomous swarm robots in Mars exploration, highlighting key contributions to the field of swarm robotics and their potential applications in space exploration.
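The abstract does not specify the square-spiral geometry the team used, but the standard construction is an outward rectangular spiral whose leg length grows every second turn. A minimal sketch of corner-waypoint generation under that assumption (step size and turn count are illustrative, not competition values):

```python
def square_spiral_waypoints(step=0.5, turns=4, origin=(0.0, 0.0)):
    """Generate corner waypoints of an outward square spiral.

    Segment lengths grow every second leg: step, step, 2*step,
    2*step, 3*step, ... -- the classic expanding square sweep.
    Illustrative parameters only, not the DustySWARM settings.
    """
    x, y = origin
    pts = [(x, y)]
    dx, dy = 1, 0          # start heading east
    length = step
    for leg in range(4 * turns):
        x += dx * length
        y += dy * length
        pts.append((x, y))
        dx, dy = -dy, dx   # turn left 90 degrees
        if leg % 2 == 1:   # lengthen the leg every two segments
            length += step
    return pts
```

An expanding square sweep guarantees full coverage of the area around the start point with a bounded gap between passes, which makes it a natural candidate when resources are clustered near the nest; the paper's own evaluation is what established it as the most efficient choice here.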
This paper presents the development and implementation of a Reverse-Twister search algorithm designed to optimize resource collection in a swarm of autonomous robots for space exploration. The algorithm was created by the DustySWARM NASA Robotics team with the goal of improving the efficiency of swarm-based search techniques in space exploration missions. The Reverse-Twister algorithm coordinates multiple robots to autonomously navigate and collect resources within a simulated environment. The results of the final version of the algorithm show a significant improvement in the amount of resources collected by the swarm of robots within the given time constraints. The Reverse-Twister approach enhances robot coordination, obstacle avoidance, and search efficiency, ultimately making it a promising solution for future space exploration missions. This paper outlines the design, coding, and testing of the Reverse-Twister algorithm, demonstrating its potential for improving autonomous search capabilities in extraterrestrial environments.
With the rapid advancement of technology and the increasing reliance on data acquisition and processing, uncertain data has found wide application in fields such as finance, military, logistics, and telecommunications. Traditional data management methods, however, are not equipped to handle uncertain data effectively, so the management of uncertain data has become a growing focus of data mining research. Among the techniques in this field, outlier detection stands out for its ability to identify data points that deviate from the norm, with key applications in areas such as network intrusion detection and sensor networks. While significant progress has been made in outlier detection for deterministic data, uncertain data presents unique challenges. In this study, we propose a new outlier detection method based on the possible-world model for attribute-level uncertain data. First, we adapt the anomaly score calculation of iForest (Isolation Forest) to make it suitable for uncertain data. Next, we redefine the concept of local outliers in the context of uncertain data. To improve efficiency, we combine iForest with K-nearest-neighbor query optimization to reduce the candidate set without expanding the possible worlds. Experimental results demonstrate that the proposed algorithm significantly improves detection accuracy, reduces time complexity, and enhances outlier detection performance on uncertain data.
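The paper's modified score for uncertain data is not given in the abstract; for orientation, the baseline iForest anomaly score it builds on is s(x, n) = 2^(−E[h(x)]/c(n)), where E[h(x)] is the average path length of x over the isolation trees and c(n) normalizes by the expected path length of an unsuccessful BST search over n points. A minimal sketch of that baseline formula only:

```python
import math

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def c(n):
    """Expected path length of an unsuccessful BST search over n
    points: c(n) = 2*H(n-1) - 2*(n-1)/n, with H(i) ~ ln(i) + gamma.
    Used to normalize tree depths in iForest."""
    if n <= 1:
        return 0.0
    return 2.0 * (math.log(n - 1) + EULER_GAMMA) - 2.0 * (n - 1) / n

def iforest_score(avg_path_len, n):
    """Baseline iForest anomaly score s(x, n) = 2^(-E[h(x)]/c(n)).

    Scores near 1 flag outliers (short isolation paths); scores
    near 0.5 indicate ordinary points. This sketch shows only the
    deterministic formula -- the paper's adaptation to uncertain
    data modifies this calculation.
    """
    return 2.0 ** (-avg_path_len / c(n))
```

A point whose average path length equals c(n) scores exactly 0.5, which is why 0.5 serves as the "unremarkable" baseline; shorter paths push the score toward 1.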
Breast cancer is a significant health concern, affecting one in eight women during their lifetime. Early detection plays a crucial role in reducing the risks associated with the disease, and mammography has proven to be an effective screening method. Mammograms often show early signs of breast cancer, such as microcalcifications, which appear as white spots on the images. However, the accuracy of early detection depends not only on the quality of the mammograms but also on the ability of radiologists to interpret them correctly. This research focuses on enhancing poor-quality mammogram images, specifically improving the Region of Interest (ROI). The paper details the image enhancement techniques used to improve mammogram quality, ensuring clearer visualization of critical features such as microcalcifications. By applying these methods, the paper aims to provide better tools for radiologists, improving the early detection and diagnosis of breast cancer.
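The abstract does not name the specific enhancement techniques applied; one common ROI enhancement step in mammogram pre-processing is histogram equalization restricted to the region of interest, which stretches the contrast of faint structures such as microcalcifications. A minimal NumPy sketch under that assumption (the paper's actual pipeline may differ):

```python
import numpy as np

def equalize_roi(image, y0, y1, x0, x1):
    """Contrast-enhance a rectangular Region of Interest with
    histogram equalization, leaving the rest of the image intact.

    image: 2-D uint8 array; (y0:y1, x0:x1) bounds the ROI.
    Illustrative pre-processing only -- not necessarily the
    paper's enhancement method.
    """
    out = image.copy()
    roi = image[y0:y1, x0:x1]
    hist = np.bincount(roi.ravel(), minlength=256)
    cdf = hist.cumsum().astype(np.float64)
    cdf_min = cdf[cdf > 0][0]          # first non-zero CDF value
    denom = roi.size - cdf_min
    if denom <= 0:                     # flat ROI: nothing to stretch
        return out
    # Map each gray level through the normalized CDF to spread
    # the ROI's intensities across the full 0..255 range.
    lut = np.clip(np.round((cdf - cdf_min) / denom * 255), 0, 255)
    out[y0:y1, x0:x1] = lut.astype(np.uint8)[roi]
    return out
```

Because the lookup table is built from the ROI's own histogram rather than the whole image's, dense surrounding tissue does not dilute the contrast stretch applied to the suspicious region.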