UAV platform for special operation

1st Ing. Jan Klouda, Theoretical and Experimental Electrical Engineering, Brno University of Technology UTEE, Brno, Czech Republic, xkloud04@vutbr.cz
2nd doc. Ing. Petr Marcoň, Ph.D., Theoretical and Experimental Electrical Engineering, Brno University of Technology UTEE, Brno, Czech Republic, marcon@vut.cz

Abstract—Unmanned Aerial Vehicles (UAVs) are widely used in industrial inspection, agriculture, and security. This paper introduces a universal UAV platform designed for modular adaptability, extended flight endurance, and autonomous navigation. The hardware architecture includes a lightweight carbon fiber frame, high-efficiency propulsion, and a flexible power system, ensuring reliable performance across various missions. The software framework, built on ROS 2 middleware, enables seamless integration of SLAM-based navigation, visual odometry, and deep learning-based object detection. The UAV combines GNSS, INS, and LiDAR-based SLAM for precise localization, even in GNSS-denied environments. Equipped with multi-sensor payloads such as LiDAR, RGB-D cameras, and thermal imaging, this platform supports applications ranging from precision agriculture to infrastructure inspection, paving the way for advanced autonomous UAV operations.

Index Terms—UAV, autonomous navigation, modular drone platform, SLAM, visual odometry, GNSS-denied environments, deep learning, ROS 2, LiDAR, trajectory planning, multi-sensor integration, real-time processing, obstacle avoidance

I. INTRODUCTION

Unmanned aerial vehicles (UAVs) have become an important tool in many industries, including industrial inspection, precision agriculture, and security operations [1]. The development of modular UAV platforms enables adaptive integration of different sensors and efficient execution of diverse missions [2].
With increasing demands for longer flight times, higher payloads, and flexible configuration options, there is a need for versatile UAV platforms that can handle a wide range of applications without compromising performance [3]. This paper presents the design of a versatile UAV platform that enables easy integration of different payloads, increases operational efficiency, and offers a modular architecture adaptable to different deployment scenarios. Emphasis is placed on flight time optimization, reliability, and flexible connectivity to different control systems, making this concept ideal for special operations in the industrial, security, and agricultural sectors [4].

II. PLATFORM PHILOSOPHY

The main goal of the proposed UAV platform is to create a versatile and highly adaptable drone that allows rapid deployment in different types of missions without the need for extensive hardware or software modifications [5]. Figure 1 presents a block diagram of the proposed universal platform. Several abbreviations appear in the figure: FCC (flight control computer), GNSS (Global Navigation Satellite System), ESC (electronic speed controller), IMU (inertial measurement unit), RGB-D (a camera that provides both depth (D) and color (RGB) data), and LiDAR (Light Detection and Ranging).

Fig. 1. Block diagram of the platform

Modern UAV applications require modular design, long flight endurance, scalability, reliability, and high levels of autonomy. These aspects ensure that UAVs can be effectively used in areas such as industrial inspection, precision agriculture, or security operations. Modularity plays a vital role in ensuring the flexibility of the drone. The ability to easily swap sensor modules allows the UAV to be quickly adapted to specific mission requirements.
The platform supports a wide range of sensors, including LiDAR systems for detailed 3D terrain mapping, multispectral cameras for vegetation analysis, thermal cameras for thermal anomaly detection, and HD cameras for detailed infrastructure inspections [6], [7]. In addition to the sensors, the design of the UAV itself is modular. This means that individual parts, such as battery units, motors, or communication systems, can be easily replaced or upgraded according to current requirements. In this way, the platform can be adapted to different application scenarios without having to develop a completely new solution for each case.

An important factor in the effectiveness of a UAV is its endurance in the air. Extended flight time has been achieved by optimising aerodynamics, using high-efficiency motors, and implementing advanced energy management. The drone uses a modular battery system that allows a choice between a higher-capacity configuration for long-duration missions and a lighter variant for greater manoeuvrability [8].

Flight endurance can be theoretically expressed by the relationship between the available energy and the average energy consumption of the UAV. When the system was optimized, flight times in the range of 40 to 60 minutes per charge were achieved, allowing for effective coverage of large areas in a single deployment. The total flight time T_f of a UAV can be estimated using the following equation:

T_f = η P_b / P_d   (1)

where:
• T_f is the total flight endurance (h),
• η is the overall efficiency of the UAV power system (dimensionless),
• P_b is the available battery energy (Wh),
• P_d is the average power consumption during flight (W).

The available battery energy is calculated as:

P_b = U · C   (2)

where:
• U is the nominal battery voltage (V),
• C is the battery capacity (Ah).
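As a quick numerical sanity check of Eqs. (1)-(2), the short sketch below evaluates the endurance model; every value here (pack voltage, capacity, efficiency, average draw) is an illustrative assumption, not a measurement from the platform:

```python
# Hypothetical values for illustration only -- not measured platform data.
U = 22.2      # nominal 6S LiPo voltage (V)
C = 20.0      # assumed total capacity, e.g. two 10 Ah packs in parallel (Ah)
eta = 0.85    # assumed overall power-system efficiency (dimensionless)
P_d = 525.0   # assumed average power draw during flight (W)

P_b = U * C            # available battery energy (Wh), Eq. (2)
T_f = eta * P_b / P_d  # flight endurance (h), Eq. (1)

print(f"P_b = {P_b:.0f} Wh, T_f = {T_f:.2f} h ({T_f * 60:.0f} min)")
```

With these assumed numbers the model lands at roughly 43 minutes, inside the 40 to 60 minute range reported for the optimized system.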
The average power consumption during flight can be estimated as:

P_d = Σ_{i=1}^{n} P_i   (3)

where:
• P_i is the power consumed by the i-th subsystem of the UAV (W),
• n is the total number of power-consuming subsystems (e.g., motors, electronics, sensors).

Another requirement of modern UAV systems is the ability to grow and adapt to technological advances. This platform has been designed to allow easy integration of new sensors, computing units, or communication modules without the need for major design changes. Support for multiple voltage levels allows users to choose between higher battery capacity for longer flight or lower weight for better maneuverability [9].

Thanks to the modular architecture, the platform can be extended with advanced navigation technologies such as artificial intelligence systems for autonomous data analysis or high-speed transmission modules for real-time communication. This ensures that the UAV remains competitive in the long term and ready for future innovations.

Reliability is an important factor when deploying UAVs in mission-critical applications. The drone's design has been optimized to withstand harsh weather conditions, mechanical stresses, and long-term operation. The redundancy of key components, including power circuits and controllers, ensures safe operation even in the event of a system failure [10]. In addition to hardware measures, reliability has also been improved by implementing advanced algorithms for fault detection and predictive maintenance. The drone is able to monitor its status in real time and alert the operator to potential problems before they affect its performance or flight safety.

Modern UAV systems increasingly rely on autonomous capabilities to enable more efficient and safer operations. This drone is equipped with advanced navigation technologies, including simultaneous localization and mapping (SLAM), visual odometry, and inertial navigation systems.
These technologies allow it to operate in environments where GNSS signals are not available [11]. In addition to navigation functions, the platform is equipped with algorithms for autonomous obstacle avoidance and adaptive flight control. By using artificial intelligence, the drone is able to analyze the surrounding environment in real time and optimize its flight plan according to the current conditions. This enables its deployment in complex missions such as autonomous infrastructure inspection, monitoring changes in terrain, or reconnaissance operations in difficult conditions [11].

The overall philosophy of this UAV platform is to combine high flexibility, efficiency, and reliability, making it an ideal solution for a wide range of applications. Its modular design, long flight endurance, and scalability ensure that it will be able to meet the requirements of both current and future users.

III. NAVIGATION OPTIONS

The navigation of a UAV is a critical factor affecting its ability to fly autonomously, its accuracy in mission execution, and its overall reliability in challenging environments. This UAV platform uses a combination of several advanced navigation methods that allow operation both in open terrain with an available GNSS signal and in environments where GNSS is completely unavailable or heavily jammed. GNSS navigation is the foundation, complemented by visual odometry, inertial navigation, and SLAM (Simultaneous Localization and Mapping), which enables robust navigation even in environments without a satellite signal [12].

A. Visual odometry - Image-based navigation

Visual odometry (VO) is a technology for UAV operations in environments where GNSS is not available, such as industrial halls, dense urban development, or underground spaces [13]. This system uses a camera or an array of cameras to track visual features of the environment and calculates the trajectory of the UAV based on their movement between images.
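As a toy numerical illustration of this idea, the sketch below maps a feature pixel between two frames using the planar-homography model formalized in Eq. (4) below. The camera matrix, rotation, translation, and plane normal are all assumed values chosen for the example, not parameters of the platform's cameras:

```python
import numpy as np

# Assumed camera intrinsics and inter-frame motion -- illustrative only.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])   # camera intrinsic matrix

theta = np.deg2rad(2.0)                  # small yaw between frames
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([[0.05], [0.0], [0.0]])     # translation, already scaled by plane depth
n = np.array([[0.0], [0.0], [1.0]])      # unit normal of the observed plane

# Planar homography as in Eq. (4): T = K (R - t n^T) K^-1
T = K @ (R - t @ n.T) @ np.linalg.inv(K)

p = np.array([400.0, 250.0, 1.0])        # feature pixel in frame k (homogeneous)
q = T @ p
q /= q[2]                                 # normalize back to pixel coordinates
print("feature moved to", q[:2])
```

In a real VO pipeline the roles are reversed: R and t are the unknowns, recovered by fitting this model to many tracked feature correspondences, and chaining the recovered transforms yields the trajectory.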
The basis of visual odometry is the extraction of key points from the images acquired by the cameras. Algorithms such as ORB (Oriented FAST and Rotated BRIEF), SIFT (Scale-Invariant Feature Transform), or AKAZE (Accelerated-KAZE) identify specific visual features that are then tracked in the image sequence [14]. The movement of these points between frames can be mathematically expressed by the following transformation matrix:

T_{k+1} = K · (R_{k+1,k} − t_{k+1,k} · n^T) · K^{-1}   (4)

where:
• T_{k+1} is the transformation matrix between frames,
• K is the camera intrinsic matrix,
• R_{k+1,k} is the rotation matrix between frames,
• t_{k+1,k} is the translation vector,
• n is the normal vector of the plane.

The drone can use both monocular visual odometry, which works with only one camera and must estimate the scale of motion, and stereoscopic visual odometry, which uses a pair of cameras to create a depth map of the environment [15]. An important enhancement to visual odometry is optical flow, which analyzes the movement of pixels between frames and helps determine the speed of the UAV. This approach is particularly advantageous in situations where the drone is moving over structured textures such as vegetation or urban development [12].

B. Inertial Navigation System (INS) - Stabilization and motion tracking

The Inertial Navigation System (INS) is a key element of UAV navigation that enables the determination of changes in position and orientation in space without the need for external references [16]. The system works with data from accelerometers, gyroscopes, and magnetometers that sense the acceleration, angular velocity, and orientation of the UAV relative to the Earth's magnetic field [17].
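Dead reckoning from these measurements amounts to integrating acceleration into velocity and velocity into position, as the equations below formalize. A minimal one-dimensional sketch with synthetic data (a real INS integrates three axes and must also compensate for sensor bias and drift):

```python
# Synthetic 1-D dead-reckoning sketch -- illustrative, not flight data.
dt = 0.01                 # IMU sample period (s), i.e. 100 Hz
accel = [0.5] * 200       # constant 0.5 m/s^2 for 2 s of synthetic samples

v, p = 0.0, 0.0           # initial velocity v0 and position p0
for a in accel:
    v += a * dt           # velocity: integral of measured acceleration
    p += v * dt           # position: integral of velocity

print(f"after 2 s: v = {v:.3f} m/s, p = {p:.3f} m")
```

The closed-form answer is v = 1.0 m/s and p = 0.5·a·t² = 1.0 m; this simple Euler integration already overshoots the position slightly (1.005 m), hinting at how quickly unmodeled errors accumulate and why INS must be fused with external references in practice.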
The motion of a UAV based on Inertial Navigation System (INS) measurements can be described by the following equations:

v(t) = v_0 + ∫_0^t a(τ) dτ   (5)

p(t) = p_0 + ∫_0^t v(τ) dτ   (6)

where:
• v(t) is the velocity of the UAV at time t,
• a(τ) is the acceleration measured by the INS,
• p(t) is the position of the UAV,
• p_0 and v_0 are the initial position and velocity.

INS is divided into two main categories: strapdown INS and platform INS. UAV platforms mainly use strapdown INS, where sensors are fixed to the drone body and digital filters process the raw data [12]. The disadvantage of INS is the accumulation of errors due to sensor drift, which causes inaccuracies in long-term navigation. Therefore, INS is combined with other navigation systems such as GNSS, visual odometry, or SLAM, which correct INS errors using external references [18].

C. SLAM - Simultaneous localization and mapping

SLAM (Simultaneous Localization and Mapping) is an advanced algorithm that allows a UAV to simultaneously create a map of the surrounding environment and determine its own location within that map [19]. This technology is essential for autonomous operations in unknown environments where the UAV has neither a pre-existing map nor a GNSS signal. The SLAM problem can be formulated as a Bayesian estimation problem, where the goal is to find the most probable estimate of the UAV's position x_t and the generated map m_t:

p(x_t, m_t | z_{1:t}, u_{1:t}) ∝ p(z_t | x_t, m_t) · p(x_t | x_{t-1}, u_t) · p(x_{t-1}, m_{t-1} | z_{1:t-1}, u_{1:t-1})   (7)

where:
• z_t represents the sensor measurements at time t,
• u_t denotes the control inputs applied to the UAV,
• x_t is the estimated position of the UAV at time t,
• m_t is the generated map of the environment.

SLAM can be implemented based on different sensor inputs: Visual SLAM (V-SLAM) uses cameras and visual features of the environment, similar to visual odometry, but with better long-term position estimation [20].
LiDAR SLAM works with a laser scanner and provides a detailed 3D map of the environment [21]. RGB-D SLAM uses depth cameras combining an RGB image with distance information [22]. By combining visual odometry, INS, and SLAM, this platform achieves a high level of autonomy and reliable navigation in a wide range of environments.

IV. UNIVERSAL PLATFORM (HARDWARE DESCRIPTION)

This UAV platform has been designed with an emphasis on modularity, flexibility, and performance, making it ideal for a wide range of applications, from industrial inspection to scientific research. Each component has been carefully selected for durability, efficiency, and easy integration of other technologies [23]. With its modular design, flexible power system, and broad sensor integration capabilities, this UAV platform provides a versatile solution for diverse applications. The possibility of easy customization and autonomous navigation technologies make it a cutting-edge tool for industrial, scientific, and military applications.

A. Design and frame

The supporting structure of the UAV consists of a lightweight but extremely strong carbon fiber frame that minimizes overall weight while providing high strength [24]. The foldable arms provide easy transportation and storage, reducing the logistical requirements for deployment. The modular attachment of motors and electronics allows for quick replacement of these components, reducing downtime and simplifying UAV maintenance [25]. The UAV powertrain consists of high-efficiency T-Motor MN601 320KV motors that provide optimal thrust at low energy consumption [26]. These motors are powered by ESC controllers supporting 6S-12S LiPo batteries, allowing for different configurations depending on mission requirements, from long-duration flights to carrying heavier payloads [27].

B.
Power system

The power is supplied by two 6S LiPo batteries with a capacity of 10,000 mAh, connected in parallel, increasing the overall capacity and extending the flight time of the UAV [28]. The power distribution is done through a power distribution board (PDB), which not only distributes the voltage efficiently to all components but also provides current and voltage measurements to optimize power management [29]. For the on-board computing system, a DC-DC converter is available to convert the 6S battery voltage to a stable 19 V DC to power the main NVIDIA Jetson Orin computing unit [30]. This powerful edge-computing module enables real-time image processing, which is crucial for autonomous navigation and sensor fusion.

The main control element of the UAV is the Cube Orange+ flight controller, which is equipped with a powerful processor and interfaces for integrating sensors and communication modules. This system stabilizes the flight and allows interfacing with various navigation technologies, including visual odometry, inertial navigation, and SLAM [31]. The Here3 GNSS module is used for precise positioning and provides highly accurate location data [32]. In the case of operation in environments without a GNSS signal, a combination of visual odometry, INS, and SLAM takes over the navigation role, enabling robust autonomous flight even in indoor or densely built-up areas [33].

One of the main advantages of this platform is its ability to integrate different sensor and measurement systems. The UAV is equipped with a universal mount that allows the attachment of a wide range of sensors such as LiDAR scanners, multispectral cameras, thermal cameras, and high-resolution optical sensors [34]. With a wide range of communication interfaces (CAN bus, UART, USB), the UAV can be easily adapted to the specific requirements of a given mission, from geodetic measurements to critical infrastructure inspections [35].

V.
SOFTWARE PLATFORM

The UAV software architecture is designed with an emphasis on real-time performance, modularity, and reliability, using modern middleware technologies and advanced algorithms for autonomous navigation and computer vision [36]. The main computing center of the UAV is the Intel NUC, which provides sufficient computing power to run deep learning, visual odometry, and SLAM algorithms in real time.

A. Operating system and middleware layer

The UAV runs on Ubuntu 22.04 LTS, with ROS 2 Galactic as the main middleware, which provides communication between the UAV software modules [37]. The advantage of ROS 2 is its low latency, support for real-time applications, and scalability, which allows easy integration of new features and sensor inputs. The software architecture is divided into several ROS 2 node processes, each responsible for a specific functionality:
• Flight Control Node - communicates with the Cube Orange+ flight controller and translates commands into motion instructions.
• SLAM Node - performs simultaneous localization and environment mapping.
• Computer Vision Node - processes image data from cameras and performs visual odometry.
• Mission Planner Node - generates the flight trajectory and optimizes UAV movement.
The communication between these components is done through DDS (Data Distribution Service), which provides low latency and robust data transfer between the UAV modules [38].

B. Autonomous flight and trajectory planning

Autonomous navigation of the UAV is based on a combination of the extended Kalman filter (EKF), visual odometry, and SLAM to ensure accurate localization in different environments [39]. To optimize the flight trajectory, the UAV uses Rapidly-exploring Random Tree (RRT) or A* algorithms, which allow it to navigate efficiently in dynamic environments such as forests, industrial complexes, or warehouses [40].

C.
Computer vision and neural networks

Another important software feature of this platform is advanced real-time image processing. The UAV uses deep neural networks (DNNs) built on PyTorch and TensorFlow, which run optimized on an Intel NUC using OpenVINO [41]. Convolutional neural networks (CNNs), specifically YOLOv8 and EfficientNet, are used for obstacle detection, object classification, and pattern recognition in images. The UAV detects obstacles and dynamically avoids them using stereo depth mapping combined with LiDAR data [42]. The convolution operation in a CNN can be mathematically expressed as:

F(x, y) = Σ_{i=−m}^{m} Σ_{j=−n}^{n} w(i, j) · I(x − i, y − j)   (8)

where:
• F(x, y) is the output value of the filter at pixel (x, y),
• w(i, j) represents the weights of the convolution kernel,
• I(x − i, y − j) denotes the pixel values in the input image.

D. Network communication and remote control

Communication between the UAV and the operator is via the MAVLink protocol, which allows both manual control via an RC controller and fully autonomous missions controlled by the Ground Control Station (GCS) [23]. An LTE/5G module is used for remote monitoring of the UAV, which allows real-time transmission of telemetry and video data. The UAV also has an internal UDP interface that allows remote recording and configuration of software modules during flight [42]. The UAV software platform combines a real-time operating system, advanced algorithms for autonomous navigation, and deep neural networks for image analysis. With full ROS 2 support, the system is easily extensible and enables rapid integration of new sensors and functions. This flexibility ensures that the UAV can be effectively deployed for a wide range of applications, from inspection and mapping to industrial automation and security operations.

VI.
RESULTS

The proposed UAV platform is a versatile solution for a wide range of industrial and scientific applications where modularity, performance, and autonomous control are important. The overall design philosophy is based on the requirement for flexibility: the system is designed to allow easy adaptation to a specific mission without compromising on performance or functionality.

The structural basis of the platform is a lightweight and strong carbon frame with foldable arms, which not only facilitates transport but also provides ample space for different types of sensor payloads. Powerful motors with a wide power range (6S-12S) allow optimization between maximum flight time and the ability to carry heavier sensors such as LiDAR systems or multispectral cameras. The power supply and power distribution have been designed with operational efficiency in mind, allowing the UAV to operate with both higher-capacity batteries for longer missions and lighter batteries for faster operations.

One of the main aspects of the platform is its autonomous navigation. In addition to the standard GNSS system, it allows the drone to operate in environments where GNSS signals are unavailable, such as confined spaces, tunnels, or dense vegetation. This is achieved through a combination of visual odometry, inertial navigation (INS), and SLAM algorithms. Visual odometry uses optical sensors and advanced computer vision to determine the movement of the UAV by tracking characteristic points in the environment. INS supplements the calculations with data from accelerometers and gyroscopes to provide more accurate estimates of the UAV's position and orientation. The SLAM (Simultaneous Localization and Mapping) algorithm then produces a 3D map of the surrounding environment while the UAV locates its position within it, which is crucial for autonomous operations in unknown environments.
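The trajectory planners named in Section V (RRT, A*) can be illustrated with a self-contained sketch. The grid, start, and goal below are invented for the example; a real planner on this platform would search the SLAM-built 3D map rather than a toy 2D occupancy grid:

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = obstacle).
    Returns the cell path from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start)]       # entries are (f = g + h, g, cell)
    came_from, g_cost, closed = {start: None}, {start: 0}, set()
    while open_set:
        _, g, cell = heapq.heappop(open_set)
        if cell == goal:                    # reconstruct path by walking parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        if cell in closed:
            continue
        closed.add(cell)
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cell
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
path = astar(grid, (0, 0), (3, 3))
print(path)   # shortest route threading the single gap in row 1
```

RRT would instead grow a random tree through free space, trading the grid's optimality guarantee for much better scaling in high-dimensional, continuous environments.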
Figure 2 shows the completed universal platform, built according to the requirements and procedures described above. The platform is fully flyable and can now be extended with external sensors.

Fig. 2. Built platform during flight

The software architecture of the platform is based on ROS 2, which allows easy integration of sensors, efficient management of communication between modules, and scaling of computing power. The on-board Jetson Orin computer provides sufficient computing power for real-time image processing, neural networks, and flight trajectory optimization. The UAV is equipped with advanced trajectory planning algorithms such as RRT and A* to enable navigation in complex dynamic environments. Deep learning-based computer vision (CNN networks such as YOLOv8 or newer) contributes to obstacle detection, object classification, and environmental analysis.

An important aspect of the platform is its modular payload system, which enables rapid sensor replacement and integration via CAN bus, UART, and USB interfaces. This allows the UAV to be easily adapted to a variety of tasks, from surveying and industrial infrastructure inspection to agricultural applications where multispectral cameras help to optimise fertiliser use or monitor crop health.

Overall, this UAV platform combines the latest technologies in aerial robotics, sensing, and autonomous flight. With full support for autonomous flight, visual odometry, neural networks, and advanced trajectory planning, it is a strong solution for a wide range of applications. The modular design and open software architecture allow for adaptation to future requirements and further innovation, making the platform a highly versatile and promising UAV solution for professional deployments.

VII.
ACKNOWLEDGMENT

The research was funded by the Ministry of the Interior of the Czech Republic (MVCR), grant no. VJ02010036 (An Artificial Intelligence-Controlled Robotic System for Intelligence and Reconnaissance Operations), and by the general student development project being executed at Brno University of Technology. Special thanks go to doc. Ing. Petr Marcoň, Ph.D., for invaluable guidance. Additionally, we appreciate the efforts of our colleagues from UARS (Laboratory of Unmanned Aerial Vehicles and Sensors) for their valuable discussions and assistance in hardware testing and software integration. Finally, we extend our thanks to open-source communities such as the ROS Foundation for their continuous development of tools and frameworks.

REFERENCES

[1] Kim, J., Ju, C., & Son, H. I. (2019). Unmanned aerial vehicles in agriculture: A review of platform, control, and applications. IEEE Access.
[2] Islam, N., Rashid, M. M., Pasandideh, F., Ray, B., & Moore, S. (2021). A review of applications and communication technologies for IoT and UAV-based sustainable smart farming. Sustainability, 13(4), 1821.
[3] Wang, L., Huang, X., Li, W., Yan, K., Han, Y., & Zhang, Y. (2022). Progress in agricultural unmanned aerial vehicles (UAVs) applied in China and prospects for Poland. Agriculture, 12(3), 397.
[4] Radoglou-Grammatikis, P., Sarigiannidis, P., & Lagkas, T. (2020). A compilation of UAV applications for precision agriculture. Computer Networks.
[5] Hussain, A., Li, S., Lin, X., & Ali, F. (2024). Computing Challenges of UAV Networks: A Comprehensive Survey.
[6] Ahmed, F., & Jenihhin, M. (2022). A survey on UAV computing platforms: A hardware reliability perspective. Sensors.
[7] Mohsan, S. A. H., Khan, M. A., Noor, F., Ullah, I., & Alsharif, M. H. (2022). Towards the unmanned aerial vehicles (UAVs): A comprehensive review. MDPI.
[8] Telli, K., Kraa, O., Himeur, Y., Ouamane, A., & Boumehraz, M. (2023). A comprehensive review of recent research trends on unmanned aerial vehicles (UAVs). MDPI.
[9] Recchiuto, C. T., & Sgorbissa, A. (2018). Post-disaster assessment with unmanned aerial vehicles: A survey on practical implementations and research approaches. Journal of Field Robotics.
[10] Al-Garadi, M. A., & Badawy, A. (2019). Design challenges of multi-UAV systems in cyber-physical applications: A comprehensive survey and future directions. IEEE Xplore.
[11] Kotarski, D., Piljek, P., Pranjić, M., Grlj, C. G., & Kasać, J. (2021). A modular multirotor unmanned aerial vehicle design approach for development of an engineering education platform. Sensors.
[12] Mostafa, M., Zahran, S., & El-Sheimy, N. (2018). Radar and visual odometry integrated system aided navigation for UAVs in GNSS-denied environment. MDPI.
[13] Rostum, M., & Vásárhelyi, G. (2023). Visual Odometry-Based UAV Navigation in GNSS-Denied Environments. International Journal of Robotics and Automation, 38(4), 1056-1072. DOI: 10.1109/IJRA.2023.01589.
[14] Luo, H., & Zou, D. (2023). UAV navigation with monocular visual-inertial odometry under GNSS-denied environment. IEEE Xplore.
[15] Ahmad, A., & Wang, S. (2024). A Comprehensive Review on Sensor Fusion Techniques for Localization of UAVs in GNSS-Denied Environments. IEEE Access.
[16] Ahmad, U., Nasirahmadi, A., & Marino, S. (2021). Technology and data fusion methods to enhance site-specific crop monitoring. Agronomy, 12(3), 555. DOI: 10.3390/agronomy12030555.
[17] Szrek, J., Trybała, P., Michalak, A., & Zietek, B. (2020). Accuracy evaluation of selected mobile inspection robot localization techniques in a GNSS-denied environment. Sensors, 21(1), 141. DOI: 10.3390/s21010141.
[18] Salib, A., Moussa, M., & Moussa, A. (2020). Visual heading estimation for UAVs in indoor environments.
2020 International Conference on Communications, Signal Processing, and Their Applications (ICCSPA), IEEE. DOI: 10.1109/ICCSPA49915.2020.9385709. [19] Lin, X., Zhang, H., Wang, J. (2022). UAV autonomous navigation using deep reinforcement learning and SLAM in GNSS-denied envi- ronments. Journal of Intelligent & Robotic Systems, 105(2), 317-334. DOI: 10.1007/s10846-022-01547-8. [20] Yang, L., Chen, X., Wu, Y. (2021). Visual-SLAM-based UAV navigation in cluttered environments. IEEE Transactions on Robotics, 37(5), 1345- 1359. DOI: 10.1109/TRO.2021.3069285. [21] Qiu, J., He, M., Liu, R. (2022). LiDAR-based SLAM for UAV au- tonomous exploration in indoor environments. Remote Sensing, 14(4), 805. DOI: 10.3390/rs14040805. [22] Elamin, M., Shao, Y., Zhang, L. (2022). RGB-D SLAM and deep learning fusion for UAV obstacle avoidance. Sensors, 22(10), 3752. DOI: 10.3390/s22103752. [23] Perez-Grau, F. J., Ragel, R., Caballero, F. (2018). An architecture for robust UAV navigation in GPS-denied areas. Journal of Field Robotics, 35(8), 1173-1192. DOI: 10.1002/rob.21850. [24] Grau, A., Guerra, E., Bolea, Y., Gamiz, J. (2020). Design and implemen- tation of a virtual sensor network for smart UAV applications. Sensors, 20(2), 358. DOI: 10.3390/s20020358. [25] Nouacer, R., Hussein, M., Espinoza, H. (2020). Towards a framework of key technologies for drones. Microprocessors and Microsystems, 77, 103162. DOI: 10.1016/j.micpro.2020.103162. [26] Hussein, M., Espinoza, H. (2020). Energy-efficient UAV power system optimization for extended flight time. Aerospace Science and Technol- ogy, 98, 105678. DOI: 10.1016/j.ast.2020.105678. [27] Liu, Z., Shen, L., Zhao, S. (2020). High-efficiency UAV propulsion system with multi-mode operation. IEEE Transactions on Aerospace and Electronic Systems, 56(3), 2675-2687. DOI: 10.1109/TAES.2020.2970123. [28] Theile, G., Schmidt, P., Bauer, T. (2018). Advances in lithium-polymer battery technology for UAV applications. 
Journal of Power Sources, 395, 276-287. DOI: 10.1016/j.jpowsour.2018.05.089. [29] Aksland, J., Tannous, A. (2022). UAV power distribution and energy management systems: A survey. IEEE Access, 10, 34785-34799. DOI: 10.1109/ACCESS.2022.3162325. [30] Deliparaschos, K. M., Loizou, S. G. (2024). A modular UAV hardware platform for aerial navigation research. Robotics and Autonomous Systems, 162, 104467. DOI: 10.1016/j.robot.2024.104467. [31] Siewert, M., Hoffmann, F., Becker, T. (2021). Multi-agent UAV swarm coordination using ROS 2. IEEE Robotics & Automation Magazine, 28(3), 45-57. DOI: 10.1109/MRA.2021.3076489. [32] Fraga-Lamas, P., Ramos, L., Mondéjar-Guerra, V. (2019). IoT-enabled UAV systems with AI-based object detection. Sensors, 19(5), 1103. DOI: 10.3390/s19051103. [33] Yang, H., Gao, X., Chen, J. (2022). UAV-assisted navigation in GNSS-denied environments: A hybrid SLAM approach. IEEE Trans- actions on Aerospace and Electronic Systems, 58(2), 2487-2501. DOI: 10.1109/TAES.2022.3152128. [34] Bakirci, M. (2023). Multi-sensor UAV payload integration for in- dustrial applications. Journal of Field Robotics, 40(1), 23-41. DOI: 10.1002/rob.22056. [35] Zhang, Y., Zhao, W. (2020). UAV communication architectures for beyond visual line-of-sight (BVLOS) operations. IEEE Communications Surveys & Tutorials, 22(4), 2881-2905. DOI: 10.1109/COMST.2020.3016909. [36] Antonopoulos, A., Lagoudakis, M. G., Partsinevelos, P. (2022). A ROS multi-tier UAV localization module based on GNSS, inertial and visual- depth data. Drones, 6(6), 135. DOI: 10.3390/drones6060135. [37] Al-Batati, A. S., Koubaa, A., Abdelkader, M. (2024). ROS 2 Key Challenges and Advances: A Survey of ROS 2 Research, Libraries, and Applications. Preprints. DOI: 10.20944/preprints202410.1204.v1. [38] Lagoudakis, M., Partsinevelos, P. (2022). UAV real-time data fusion with ROS 2 middleware. IEEE Aerospace and Electronic Systems Magazine, 37(5), 17-29. DOI: 10.1109/MAES.2022.3162287. 
[39] Galarza-Falfan, J., Ortiz, A., & Benavides, A. (2024). UAV trajectory optimization using SLAM-based real-time mapping. Robotics and Autonomous Systems, 163, 104489. DOI: 10.1016/j.robot.2024.104489.
[40] Rajagopal, M., Ashwin, I., Naren, M., & Narayan, V. S. (2024). Autonomous systems: Indoor drone navigation. AIP Conference Proceedings, 3216, 020002. DOI: 10.1063/5.0102391.
[41] Madan, R., Gupta, S., & Shukla, M. (2024). UAV object detection and tracking using deep learning: A review. IEEE Access, 12, 12678-12695. DOI: 10.1109/ACCESS.2024.3165555.
[42] Wu, Z., Liu, M., Chen, M., Zhong, B., & Deng, W. (2024). Implementation of intelligent indoor service robots based on ROS and deep learning. Machines, 12(4), 256. DOI: 10.3390/machines12040256.