Posted about 3 years ago by rishabsingh3003
When was the last time someone showed you a camera capable of object detection like this, without using a companion computer or any other external hardware? Today I’d like to introduce a camera capable of this, and much more!
For the past few months, I have had the opportunity to work on a fascinating new camera: the OAK-D series from Luxonis. I have been experimenting with the original OAK-D and the OAK-D-IOT-75 for a while now. In this blog (Part 1 of a series I hope to post), I want to give a brief description of each:
OAK-D:
This is essentially an RGB-D camera (a stereo camera plus RGB camera system, much like the Intel RealSense D435i). If you are not familiar with stereo cameras, they generate a 3-D point cloud (3-D geometric coordinates of the image at real-world scale) from two or more cameras mounted parallel to each other. This can be used for several applications, such as obstacle avoidance. The technology isn’t new, and we have been working with the RealSense cameras for a long time.
What’s new about this particular camera is its ability to run AI/computer vision algorithms onboard the camera itself. As you may be aware, most computer vision applications on drones require a heavy (and potentially expensive) companion computer onboard. This is a considerable barrier for anyone wanting to experience this technology. Well, not anymore! Detecting humans, everyday objects and animals can be done at a relatively high refresh rate inside the camera, with no additional hardware required beyond any low-powered USB-enabled companion computer, such as an RPi Zero. Even training the camera to detect custom objects for your tailor-made applications is super easy!
What makes this sensor even better for ArduPilot is that detected objects can be tracked in 3D space via the stereo camera. In essence, the camera outputs the detected object and its 3D coordinates to the host (the device the camera is connected to). Alternatives like the RealSense series have to transfer entire image frames to the host, which requires USB3 (terrible for EM noise!) and a relatively fast companion computer. Other neat features include onboard scripting capabilities (much like Lua scripting for ArduPilot), an inbuilt IMU, and object tracker support via an inbuilt EKF and other algorithms, among many more.
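To make that data flow concrete, here is a minimal host-side sketch using Luxonis’ DepthAI Python API (this is not my actual integration script, just an illustration): a MobileNet spatial detection network runs entirely on the camera, and only the small detection results with 3D coordinates cross the USB link. The model blob path is a placeholder, and node or enum names may differ slightly between DepthAI releases.

```python
# Minimal sketch: spatial object detection on an OAK-D with the DepthAI Python API.
# Assumptions: depthai 2.x installed, a compiled MobileNet-SSD blob at BLOB_PATH.
import depthai as dai

BLOB_PATH = "mobilenet-ssd.blob"  # placeholder path to a compiled model blob

pipeline = dai.Pipeline()

# RGB camera feeds the neural network
cam_rgb = pipeline.create(dai.node.ColorCamera)
cam_rgb.setPreviewSize(300, 300)
cam_rgb.setInterleaved(False)

# Stereo pair provides depth so detections get 3D coordinates
mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)    # CAM_B/CAM_C on newer releases
mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
stereo = pipeline.create(dai.node.StereoDepth)
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)

# Spatial detection network runs on the camera itself
nn = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)
nn.setBlobPath(BLOB_PATH)
nn.setConfidenceThreshold(0.5)
cam_rgb.preview.link(nn.input)
stereo.depth.link(nn.inputDepth)

# Only the compact detection results are streamed to the host
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("detections", maxSize=4, blocking=False)
    while True:
        for det in q.get().detections:
            # spatialCoordinates are in millimetres relative to the camera
            print(det.label, det.spatialCoordinates.x,
                  det.spatialCoordinates.y, det.spatialCoordinates.z)
```

The key point is the last loop: the host only ever sees label indices and x/y/z coordinates, never full frames.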
I would also like to mention some of the issues I have faced with the camera. There are problems with the RGB camera’s focus when it is used in high-vibration environments. Luxonis is addressing this by launching a “fixed-focus” version of the camera in the near future. A comparison of fixed focus vs auto focus can be seen here: https://www.youtube.com/embed/jpnsTsCGbQk
The other issue is that the quality of the depth frames isn’t as good as the RealSense cameras; they definitely have more noise. I have found workarounds, though, and the Luxonis team plans to launch the OAK-D-PRO, which should improve this further.
OAK-D-IOT-75:
This is the EXACT same camera as the OAK-D, except that it has an ESP32 (with WiFi and Bluetooth) on board, which is a perfect fit for us ArduPilot enthusiasts! The camera can talk directly to the ESP32, and the ESP32 can interface with the flight controller via a serial port. This means we can finally have computer vision/AI applications without needing any companion computer: plug the camera directly into your flight controller, with no other equipment required!
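As a rough illustration of the kind of traffic that flows over that serial link, here is a hedged sketch using pymavlink in Python (the real ESP32 firmware would use the C MAVLink library, and the exact messages in my scripts may differ): it sends a standard DISTANCE_SENSOR message toward the flight controller. The port name, baud rate and distances are placeholders.

```python
# Hedged sketch: forwarding a detection's forward distance to ArduPilot as a
# MAVLink DISTANCE_SENSOR message using pymavlink. Port and baud are placeholders.
import time
from pymavlink import mavutil

master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=921600)
master.wait_heartbeat()  # wait until the flight controller is talking to us

def send_forward_distance(distance_m: float):
    """Report an obstacle straight ahead; ArduPilot expects centimetres."""
    master.mav.distance_sensor_send(
        int(time.monotonic() * 1000) & 0xFFFFFFFF,   # time_boot_ms
        20,                                           # min_distance (cm)
        1000,                                         # max_distance (cm)
        int(distance_m * 100),                        # current_distance (cm)
        mavutil.mavlink.MAV_DISTANCE_SENSOR_UNKNOWN,  # sensor type
        0,                                            # sensor id
        mavutil.mavlink.MAV_SENSOR_ROTATION_NONE,     # orientation: forward
        0)                                            # covariance (unknown)

# Example: an object detected 3.2 m in front of the camera
send_forward_distance(3.2)
```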
This is the camera that I have spent the last few weeks integrating and making new example scripts for. If you are interested in my development and integration work, please read the second part of this blog for some interesting applications!
26 posts - 9 participants
Read full topic
Posted about 3 years ago by OlivierB
Congratulations to Shiv Tyagi (aka shiv-tyagi on GitHub) for winning ArduPilot’s “Contribution of the Month” award for October!
Shiv, a relatively new developer within the ArduPilot community, was nominated by members of the dev team and won the prize for his contributions to ArduPilot code development. He has been busy monitoring the ArduPilot issues list and has authored many successfully merged pull requests addressing some of them.
Thank you, Shiv; your work and contributions are much appreciated by the dev team and the ArduPilot community at large!
The prize for this month was $250 for the top contributor. Thanks to those who donated to ArduPilot, including our Corporate Partners. If you are a company and wish to donate a prize for an upcoming month, please email the partners email list ([email protected]).
Congrats again!
3 posts - 3 participants
Read full topic
Posted about 3 years ago by Artem_Skorsky
I have been working on Loko for a long time, and the first version of Loko is coming soon. It is also open source; anyone who wants to join is welcome.
Loko: the smallest GPS tracker for drones, FPV racers and RC planes
1 post - 1 participant
Read full topic
Posted about 3 years ago by TomSeymour
https://www.youtube.com/embed/sjk0GyHpUS4
The Explora is up and running with standard RC and goes like a dream over my little obstacle course. It’s able to drive over just about anything I can point it at and is even able to go up and down some pretty steep steps, especially with the help of the built-in traction control mode. I think it would almost drive up a wall if I asked it to.
Up next is fitting a Pixhawk flight controller and getting ArduRover to perform some autonomous navigation with GPS.
Stay tuned.
Previous posts:
Part 1 - Project: Rough terrain navigation using Deep Reinforcement Learning Part 1 - #9 by TomSeymour
Part 2 - Project: Rough terrain navigation using Deep Reinforcement Learning Part 2 (video thread)
2 posts - 2 participants
Read full topic
Posted about 3 years ago by rmackay9
https://www.youtube.com/embed/V8N3lA-20fs
I’d like to share the results of an autonomous boat project I’ve been working on with AttracLab, Shimane University and a company called Lighthouse. This project is sponsored by DeSET and the Nippon Foundation, which promotes technologies to assist with their goal of mapping all the world’s oceans by 2030 with a resolution of at least one depth reading per 100m x 100m square. Currently only about 20% of the ocean has been mapped to this level of detail. The more specific goal of my team (Team2) is to lower the barrier to entry of creating drone boats.
I’m not trying to sell this boat; instead, it is a prototype to prove what is possible and provide a reference so others can replicate it. Here are some details:
The frame was provided by AttracLab, measures about 185cm x 30cm x 80cm and is made from FRB. I wish this frame were readily available for anyone to buy, but at least for the moment the original manufacturer (who is not AttracLab) needs convincing that there is enough demand for this exact frame before producing more of them
Power comes from a Torqeedo 1003c electric motor, which produces about 2HP. There are two supported battery configurations: either the 0.915kWh Torqeedo battery that comes with the motor or 6x Panasonic ebike batteries (2.4kWh total)
Hondex HS-E8 sonar with 200m range and custom firmware (provided by Hondex) to provide depth and backscatter data over NMEA. Surprisingly, this sonar’s NMEA interface normally only provides GPS data, so I would recommend that anyone replicating this setup use one of the other sonars ArduPilot supports (a small sketch of what the NMEA depth sentences look like appears after this list)
The autopilot is a Cube Gold (same as CubeOrange) running ArduPilot Rover-4.2.0-DEV
Telemetry and manual control is via HereLink
Mission Planner was used for checking on the vehicle’s status in real-time, mission planning and moving the vehicle around in Guided mode
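For reference, the depth data arrives as plain NMEA 0183 ASCII sentences over serial. Below is a small hedged sketch parsing a DPT (depth below transducer) sentence the way a driver roughly would; the talker ID and the exact sentences the Hondex unit emits are assumptions for illustration, and ArduPilot’s own NMEA rangefinder driver already handles this.

```python
# Hedged sketch: parsing an NMEA 0183 DPT sentence into a depth reading.
# The "SD" talker ID and the presence of a DPT sentence are assumptions here.
def nmea_checksum_ok(sentence: str) -> bool:
    """Verify the XOR checksum between '$' and '*'."""
    body, _, checksum = sentence.strip().lstrip('$').partition('*')
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return checksum != '' and calc == int(checksum, 16)

def parse_dpt(sentence: str):
    """Return depth below the water line in metres, or None if not a valid DPT sentence."""
    if not nmea_checksum_ok(sentence):
        return None
    fields = sentence.split('*')[0].split(',')
    if not fields[0].endswith('DPT'):
        return None
    depth_below_transducer = float(fields[1])                    # metres
    offset = float(fields[2]) if len(fields) > 2 and fields[2] else 0.0  # +ve: transducer below water line
    return depth_below_transducer + offset

# Example: 12.3 m below the transducer, transducer 0.5 m below the surface -> 12.8
print(parse_dpt("$SDDPT,12.3,0.5*62"))
```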
We tested this boat for four days in Okinoshima, Japan, and it all went surprisingly smoothly. 73km of sea floor was mapped (measured by how far the boat traveled), which was totally acceptable considering the daily time constraints and that this was the first time the boat had touched salt water. Below is the result after the data was processed by ReefMaster.
Some impressions and things we learned during testing:
Despite its higher cost, I’m very happy we switched from the original gas engine to the Torqeedo. Beyond its linear feedback, quiet operation and low “fueling” costs, this motor provides tons of information including RPM, voltage, current and any errors, and the new AP driver takes advantage of this to keep the overall system working reliably while also keeping the pilot informed of what’s going on. The only thing I don’t like about the motor is the 2 or 3 seconds of lag when it first starts rotating. I’ve heard this is to protect the motor from a sudden increase in current, but I’ve never seen another electric motor/ESC combo require this, and it makes speed control more difficult when the vehicle first starts moving. Still, I highly recommend using the Torqeedo over a simple DC motor or gas engine.
Boats with displacement hulls consume a lot of energy at high speeds! Below is a rough graph produced from a back-and-forth test at various speeds with a half-depleted battery (so it’s not super accurate). The increase in consumption above 2.0m/s (7.2km/h) was especially noticeable, and we eventually settled on 1.8m/s (6.5km/h) as the best balance between energy consumption and time available; i.e. too fast and you run out of battery, too slow and you run out of time.
The boat handled rough conditions very well but its yaw control was worse than expected when traveling downwind. I suspect the problem is the throttle output was very low and the default motor thrust scaling does not match the nearly linear thrust of the motor.
This frame is incredibly agile, probably more than is really required of a mapping boat that spends most of its time traveling in straight lines. A catamaran hull might have been a better choice because, while not as agile, it might be more efficient.
Of course we hit some small issues during the week that we hope to resolve in the near-ish future:
The boat’s mid and aft hatches are screwed down instead of being hinged, meaning it requires a drill and a few minutes to open or close them
On Day 2, one of the Panasonic ebike batteries shut down while the vehicle was being aggressively driven in Manual mode. The boat kept going, and in fact we didn’t notice the failure until the end of the day, but it meant the capacity for that day was only 2kWh instead of 2.4kWh. In general I like these batteries because they are a good size (0.4kWh each), easy to carry, easy to buy and easy to charge. They also have built-in safety features like short-circuit protection, but the flip side is they shut down if the current draw increases too rapidly or the overall current flow is too high. Also, because they don’t provide a digital interface, there is no way to directly know if they have failed.
The Panasonic batteries have a lower voltage (28V) than the Torqeedo batteries (32V) meaning the vehicle’s top speed is slightly lower than it could be. This issue is especially pronounced after the batteries start getting low. Finding an alternative battery of 32V seems nearly impossible but the Torqeedo motor appears to be able to accept up to 40V (only bench tested) so some other 36V batteries may be a good option (Makita power tool batteries maybe?).
IMU Gyro calibration (run automatically soon after startup) often fails on boats if they’re already in the water (see issue here)
Rover firmware support for DO_LAND_START mission commands would have made returning to “home” easier than using Guided mode or manually advancing the mission to the last few waypoints.
ArduPilot’s support for boats is already quite good but here are some ways it will improve further:
SCurve support for Rovers and Boats. This should improve cornering and allow reducing the aggressiveness of the corners by simply increasing the WP_RADIUS parameter
Add support for the larger Torqeedo motors which use a CAN interface
Add support for the ePropulsion motors. By the way, if anyone has a contact at ePropulsion, I would like to talk with them about how to get details of the protocol they use.
Improved object avoidance using 3D AI cameras (like the OAK-D), radar or underwater scanning sonar
Support side scan sonar
I’d also like to replace this frame’s steering system with a 3D-printed alternative so that others can more easily integrate the Torqeedo into their frames. The start of a design is here.
Finally, I’d like to eventually add solar panels to improve range. This should actually be quite easy if the Torqeedo battery is used (just plug the panel in), but it is not clear to me how to do it if other batteries are used.
All feedback and advice is greatly appreciated!
11 posts - 2 participants
Read full topic