AURORA

From Waveshare Wiki

LiDAR + Binocular Vision + IMU
1000M Ethernet, WiFi, USB Type-C

Overview

Introduction

AURORA is a new all-in-one localization and mapping sensor developed by SLAMTEC, innovatively integrating LiDAR, vision, an IMU, and deep learning. The sensor requires no external dependencies: from startup it provides indoor and outdoor three-dimensional high-precision mapping and six-degree-of-freedom (6DOF) positioning. The product also ships with a complete tool chain, including the graphical interaction software RoboStudio and SDK toolkits for secondary development, allowing users to quickly build personalized applications and accelerate the commercialization of downstream products. This product has the following features:

  • Integrated with LiDAR + Binocular Vision + IMU multi-source fusion algorithm, supports external expansion (GPS/RTK, odometer, etc.)
  • Provide indoor and outdoor 3D mapping and positioning functions
  • Integrate AI technology to improve 3D perception capabilities
  • Has a complete tool chain to support client application extensions
  • Industry-leading system stability


Aurora System Composition

Aurora delivers 3D mapping and positioning capabilities to customers in an integrated form, combining a LiDAR, a binocular camera, and an IMU. Aurora is powered through a DC jack. Users can access the real-time maps and positioning data generated by Aurora over WiFi, or use the Ethernet interface for high-speed, stable data access.


Operating Principle and Usage

Aurora adopts a SLAM algorithm uniquely combining laser, vision, and IMU, integrating the characteristics of vision and laser to achieve more than 10 map-data fusions per second and to map areas of up to one million square meters. The system block diagram is shown below. The system's output is exposed through an upper-layer tool chain for secondary development, including the visual interaction tool RoboStudio, a C++ SDK, a Java SDK, a RESTful API, a ROS SDK, etc.

(System block diagram)


Usage Scenarios

This system is suitable for the following fields:

  • Environmental mapping
  • Construction engineering
  • Indoor and outdoor robot mapping and positioning
  • Humanoid robots, robot dogs

Specifications

Core Performance Indicators

Parameter                Specification
Power input              DC 12V / 2A (DC 5.5 × 2.1 mm jack)
Power consumption        10 W (typical)
Data interfaces          1 × USB Type-C, 1 × Ethernet (RJ45)
Wireless connection      WiFi
Weight                   505 g
Operating temperature    0℃ ~ 40℃


Core Parameter Indicators

Core parameter                   Specification
2D map resolution                2 cm / 5 cm / 10 cm, adjustable
Maximum mapping area             >1,000,000 ㎡
Relocalization                   Global relocalization supported
Map continuation                 Supported
Mapping and positioning mode     Laser + vision + IMU multi-source fusion
Multi-sensor synchronization     Hardware time synchronization
LiDAR range                      Up to 40 m @ 70% reflectivity
Camera specifications            Binocular fisheye global-shutter camera, HDR support, 180° FOV, 6 cm baseline
Camera frame rate                10 Hz typical, customizable to 15/30 Hz
Maximum tilt angle               No hard limit (to ensure 2D mapping quality, a tilt of no more than 30° is recommended)
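As a back-of-envelope check on the figures above, the size of a 2D occupancy grid covering the maximum mapping area can be estimated from the selectable resolution. This is a rough sketch: the one-cell-per-resolution² model is an assumption for illustration, not something the spec states.

```python
# Rough estimate of 2D grid size for Aurora's spec limits.
# Assumption (not from the spec table): one grid cell per resolution^2.

def grid_cells(area_m2: float, resolution_m: float) -> int:
    """Cells needed to cover `area_m2` at `resolution_m` per cell side."""
    return round(area_m2 / resolution_m ** 2)

# Maximum mapping area at each selectable resolution
area = 1_000_000  # m^2 (spec: >1,000,000 m^2)
for res in (0.02, 0.05, 0.10):
    print(f"{res * 100:.0f} cm grid: {grid_cells(area, res):,} cells")
```

At the 5 cm default this already comes to 400 million cells, which is one reason the coarser 10 cm setting exists for very large sites.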


Product Basic Usage

Precautions

  • Avoid drops and collisions; they may damage the equipment, causing malfunctions or even complete failure.
  • Keep the radar and lens parts clean; do not touch them directly with your hands. The device can be wiped with a cleaning cloth.
  • Ensure proper cooling: during use, mount the device on a tripod and do not cover the heat-dissipation area of the body.


Relay Module Power Supply

  • Interface: DC5521 jack
  • Input voltage (current): DC 12V (2A)
  • Power supply recommendations:
    • 1. Use a 12V 2A power adapter for normal operation;
    • 2. Use a battery with a 12V output and a capacity greater than 5000mAh to support over 2 hours of continuous operation.
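The "over 2 hours" figure can be sanity-checked against the 10 W typical draw from the spec table. A minimal sketch of the ideal energy budget (real runtime is lower due to converter losses and load spikes):

```python
# Ideal battery runtime from capacity, voltage, and power draw.
# Figures taken from the spec table (12 V, 5000 mAh, 10 W typical).

def runtime_hours(capacity_mah: float, voltage_v: float, power_w: float) -> float:
    """Ideal runtime in hours: stored energy (Wh) divided by load (W)."""
    energy_wh = capacity_mah / 1000.0 * voltage_v
    return energy_wh / power_w

print(runtime_hours(5000, 12, 10))  # 6.0 hours, ideal case
```

An ideal 12 V / 5000 mAh pack holds 60 Wh, about 6 hours at 10 W, so the 2-hour recommendation leaves a comfortable margin for losses and peaks.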


Indicator

(Indicator light diagram)

  • The indicator light states are as follows:
Indicator state     Description
Solid red           Booting up
Flashing yellow     Startup complete, waiting for initialization
Solid yellow        System initialization complete, waiting to start mapping
Solid green         Mapping
Flashing red        Device exception
Flashing green      Mapping paused
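For scripts or monitoring dashboards, the indicator table maps directly to a small lookup. The (color, pattern) encoding below is an assumption chosen for illustration; it is not part of any Aurora SDK.

```python
# Indicator states from the table above as a lookup table.
# Keys are (color, pattern) tuples; the encoding is hypothetical.

INDICATOR = {
    ("red", "solid"): "booting up",
    ("yellow", "flashing"): "startup complete, waiting for initialization",
    ("yellow", "solid"): "initialized, waiting to start mapping",
    ("green", "solid"): "mapping",
    ("red", "flashing"): "device exception",
    ("green", "flashing"): "mapping paused",
}

def describe(color: str, pattern: str) -> str:
    """Human-readable meaning of an LED state, or a fallback."""
    return INDICATOR.get((color.lower(), pattern.lower()), "unknown state")

print(describe("green", "solid"))  # -> mapping
```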


Button

(Button diagram)

  • Power button:
    • Long-press the Power button for 8 seconds to put the device into standby mode
    • In standby mode, short-press the Power button to turn the device on
  • Pause button:
    • Short-press the Pause button to pause mapping


Ethernet

(Ethernet port)

Aurora's Ethernet interface defaults to static IP mode with the address 192.168.11.1. Connect a computer to the Ethernet port and open 192.168.11.1 in a browser to view Aurora's device information and perform simple configuration.
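To reach the default address, the computer's interface must sit in the same subnet as 192.168.11.1. A quick check with Python's standard ipaddress module (the /24 prefix is an assumption; the page does not state Aurora's netmask):

```python
# Check whether a host address can reach Aurora's default static IP
# (192.168.11.1) directly. The /24 mask is an assumption.
import ipaddress

AURORA_IP = ipaddress.ip_address("192.168.11.1")

def same_subnet(host_ip: str, prefix: int = 24) -> bool:
    """True if host_ip/prefix contains Aurora's default address."""
    network = ipaddress.ip_network(f"{host_ip}/{prefix}", strict=False)
    return AURORA_IP in network

print(same_subnet("192.168.11.50"))  # True  - direct browser access should work
print(same_subnet("192.168.1.50"))   # False - give the PC a static 192.168.11.x address
```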

WiFi

Aurora has an onboard 2.4G/5G dual-band WiFi chip, which is configured in AP mode by default. After Aurora is turned on, a hotspot named "SLAMWARE-Aurora-xxxxxx" will be automatically generated, and the specific hotspot name can be found on the label of the device.
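When scanning for the hotspot programmatically, the fixed "SLAMWARE-Aurora-" prefix can be matched. The exact suffix is device-specific (see the label), so the alphanumeric pattern below is an assumption.

```python
# Pick the Aurora hotspot out of a WiFi scan list by SSID prefix.
# The suffix pattern ([0-9A-Za-z]+) is an assumption; check your device label.
import re

AURORA_SSID = re.compile(r"^SLAMWARE-Aurora-[0-9A-Za-z]+$")

def find_aurora_hotspots(ssids):
    """Filter a list of scanned SSIDs down to Aurora hotspots."""
    return [s for s in ssids if AURORA_SSID.match(s)]

print(find_aurora_hotspots(["HomeWiFi", "SLAMWARE-Aurora-1A2B3C", "cafe-guest"]))
# -> ['SLAMWARE-Aurora-1A2B3C']
```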


Host Device Software

Connect to Aurora

  1. RoboStudio and the Remote UI are graphical interaction tools. RoboStudio can establish a connection with Aurora to implement functions such as mapping, positioning, monitoring, and uploading configuration files. Click to download RoboStudio and the Remote UI, install RoboStudio, and run RoboStudio.exe.
  2. Start RoboStudio, click "File" -> "Robots", right-click in the blank space of the "Robot" navigation bar, and select "Manually Connect Robot".
    600px-Robostudio1.jpg
  3. In the pop-up window, enter "192.168.11.1" in the "IP address" field, and then click the "Connect" button to connect the device.
    Robostudio2.jpg
  4. Select "Debug" -> "Scene Strategy", select the appropriate scene strategy in the sidebar, click "Settings", and then click "Restart Application".
  • Scene strategy description:

Aurora supports two scene strategies; choose the one that matches the actual mapping scenario. Typical scenarios and the recommended strategy for each are as follows.

  • Indoor (default)
    • Typical scenarios: office buildings, offices, government centers, medical institutions, hotels
    • Scene features: laser observation is relatively abundant; the environment contains many similar-looking areas, which are prone to erroneous loop closures
  • Outdoor
    • Typical scenarios: large parking lots, shopping malls, subway stations, high-speed rail waiting halls, large government centers, medical institutions, hotel lobbies; typical outdoor scenes such as parks, streets, lawns, circular stadiums, and gymnasiums
    • Scene features: wide, large-area scenes that easily exceed the laser observation range; overall observation is relatively sparse; the environment is changeable with varied terrain

Initialize Aurora

  1. Point the Aurora camera at a feature-rich area. In RoboStudio, click "SLAM" -> "Clear Map". After the map is cleared, an exclamation mark is displayed on the RoboStudio interface; keep the device stationary and wait for initialization to complete.
  2. After initialization completes, the exclamation mark disappears, the map is displayed, and the indicator light turns green.
  • Please note the following two points:
    • During initialization, keep the device as stable as possible.
    • During initialization, point Aurora at areas with plenty of features within 2-3 meters. Avoid feature-poor environments such as open plains, refractive environments such as large areas of glass, and areas with many dynamic objects, so that there are enough features for initialization and better data quality. After the device has stood still for 3 seconds and the system has initialized successfully, move the device to enter the working state.


Start Mapping

After initialization is complete, you can proceed with mapping. Route planning and advice:

  • Gather as many observations as possible during scanning
  • Avoid heading straight into entirely new areas; take loops where appropriate
  • Avoid the influence of dynamic objects as much as possible
  • Walk as many closed loops as possible
  • Do not repeatedly re-scan already closed-loop areas, to reduce memory consumption

Precautions for mapping:

  • Clear the map before creating a complete new map; otherwise the map optimization engine is not guaranteed to take effect.
  • Keep the equipment level; in general, do not tilt it more than 20 degrees.
  • Keep the equipment stable and avoid significant shaking. Sudden stops or movements will affect the accuracy and quality of mapping to some extent.
  • After a loop returns to the starting point, keep moving and take more overlapping paths; do not stop immediately.
  • After returning to the loop's origin, if the map has not closed, continue walking until the loop closes.
  • When mapping handheld, walk at a normal pace. In feature-poor or narrow spaces, or when turning, it is recommended to slow down.
  • When scanning indoor scenes involving multiple rooms or floors, open the doors in advance. When passing through a door, scan slowly and linger at the side of the door for a while so that the features on both sides of the door can be scanned together. If a door is not open during scanning, slow down before approaching it, turn the instrument away from the door, open the door with your back to it, and enter slowly.

Entering and exiting spaces:

  • Enter and exit sideways, so that the laser and vision share a common field of view with the area before entering, for better data continuity.
  • Entering and exiting a confined space: after scanning a confined space, check whether the reference objects are sufficient and whether the structural features are obvious during the scan.

If the above two conditions are not met, then when exiting, try to keep the viewpoint aligned with areas that have good structural features, while avoiding excessive viewpoint switching.


View Trajectories and Point Clouds

After the data is synchronized, the AuroraCore-remote Visualizer can be used to view the trajectories and point clouds generated by vision.

  1. Double-click aurar_remote.exe to run the AuroraCore-remote Visualizer. In the pop-up window, enter the IP 192.168.11.1 in the "Input the address manually" field, then click the "Connect" button to connect to the device.
    600px-Remote1.png
  2. Click "Toggle Frame View" on the right toolbar to display the images and feature points observed by the camera.
    600px-Remote2.png


Save Map

In RoboStudio, click "File" -> "Map Editor" -> "Save to File" to save the map to your computer.

800px-Robostudio3.jpg


Support




Technical Support

If you need technical support or have any feedback, please click the Submit Now button to submit a ticket. Our support team will check and reply within 1 to 2 working days. Please be patient as we make every effort to help you resolve the issue.
Working hours: 9 AM - 6 PM GMT+8 (Monday to Friday)