Above is the test ground used this morning. The ground below the tarmac surface contains buried items to be detected. Sadly, due to the presumed depth of these items, marked in chalk as being around 2 metres in places, this vehicle stood no chance of detecting them. It did, though, detect a manhole cover nearby when it veered off piste! It also found ‘something’ on the edge of the ground, but we remain unsure as to what that was.
The video below, therefore, demonstrates the vehicle in motion using the tried-and-trusted washers, which could actually be buried slightly but here are not!
One can see the detection of one of the ‘clusters’ here. Clusters are found by employing one motion strategy; once a member is detected, a second strategy is employed to attempt to detect the other members of the cluster. This second strategy had to be honed (calibrated) ‘in the field’, which was possible through the use of a ‘smartphone’, although the vehicle then executes autonomously.
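The two-stage idea above can be sketched in code. This is only an illustrative guess at the sort of logic involved, not the vehicle’s actual software: the names `sweep_pattern` and `spiral_pattern` are hypothetical, and the real strategies and spacings were calibrated in the field.

```python
# Hypothetical sketch of a two-stage cluster search:
# Strategy 1 sweeps the whole area; on a hit, Strategy 2
# searches tightly around the hit for the rest of the cluster.

def sweep_pattern(width, length, lane_spacing):
    """Strategy 1: coarse back-and-forth sweep over the whole area.

    Yields (x, y) waypoints, lane by lane.
    """
    x = 0.0
    direction = 1
    while x <= width:
        y_start, y_end = (0.0, length) if direction == 1 else (length, 0.0)
        yield (x, y_start)
        yield (x, y_end)
        x += lane_spacing
        direction = -direction

def spiral_pattern(cx, cy, step, turns):
    """Strategy 2: outward square spiral around a hit at (cx, cy),
    to look for the other members of a cluster.

    Returns a list of (x, y) waypoints.
    """
    x, y = cx, cy
    leg = step
    dirs = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # right, up, left, down
    moves = []
    for i in range(turns * 2):
        dx, dy = dirs[i % 4]
        x, y = x + dx * leg, y + dy * leg
        moves.append((x, y))
        if i % 2 == 1:      # grow the leg every two moves
            leg += step
    return moves
```

In this sketch the vehicle would follow `sweep_pattern` waypoints until the detector registers a hit, then switch to `spiral_pattern` centred on that hit before resuming the sweep.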
There was plenty of discussion for future work during this exercise. There’s still some way to go, in a nutshell, but from small acorns…
Some last-minute nerves about Op Demon Chaser. Detection sensitivity has been increased so it’s right on ‘the edge’: detection at around 4 to 5 cm, depending upon what’s being detected. All hardware is now pretty much complete, including ‘bumpers’ and an ‘auto drive’ switch added just in case a signal proves difficult. Software needs to play catch-up. Testing imminent. Flying by the seat of one’s pants, as per usual. ✌️
Soon we hope to test the vehicle at a purpose-built test facility. In the meantime, though, this spot (or somewhere similar) looks a good place to perfect behaviours. By planting numerous pieces of metal (washers) it should be possible to see whether the new detection software does as expected. There’s of course still some way to go yet, but it is important to be testing at an early stage so that reality has as much influence upon design as is practically possible.
Although a substantial number of ideas remain theoretical for this project the current results are still rather pleasing. It’s reassuring to see concrete results.
As the ideas begin to gather momentum (excuse the pun) and eyebrows correspondingly begin to be raised (where previously they seemed bereft of emotion besides a little mockery), maybe more support than was previously forthcoming will start to trickle through? Seed funding, anyone? We’ll see. The jury’s still out as to whether that is even wanted, frankly.
However, the message is: if you doubt someone’s impassioned ‘vision’ from the perspective of your own predispositions, be prepared to eat humble pie, as reality can create harsh bedfellows of those embedded in the past and those who dream of a future.
All that said, there’s still time for the project to fall flat on its robotic face. At least it tried and flew in the face of conservative tradition. Always a welcome force in the world.
The above is a video of an autonomous robot “Bella” using hand-crafted “SADHU” software to detect metal (two pence) coins on the beach. Out of 17 coins there are 4 hits. You can hear a beep as each hit is registered. Swarms would do far better… 😉
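The beep-per-hit behaviour described above could be tallied along these lines. This is a hypothetical sketch, not the actual SADHU code: the threshold value, the `beep_fn` callback, and the debouncing approach are all illustrative assumptions.

```python
# Hypothetical sketch of a hit tally for a metal detector:
# count one hit (and one beep) per contiguous above-threshold
# detection, rather than one per raw sensor sample.

class HitTally:
    def __init__(self, threshold, beep_fn):
        self.threshold = threshold  # illustrative detection threshold
        self.beep_fn = beep_fn      # called once per registered hit
        self.hits = 0
        self._over = False          # debounce: are we already over threshold?

    def update(self, sensor_value):
        """Feed one detector reading; count a hit on each rising edge."""
        if sensor_value >= self.threshold and not self._over:
            self.hits += 1
            self.beep_fn()
            self._over = True
        elif sensor_value < self.threshold:
            self._over = False
        return self.hits
```

Fed a stream of readings such as `[0.1, 0.7, 0.8, 0.2, 0.6]` with a threshold of `0.5`, this would register two hits: the consecutive readings `0.7` and `0.8` count as a single detection.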
Additionally, the screenshot below, taken from an Android ‘control’ phone, displays the tally of hits (in a basic debug mode [abridged]).
More developments will build upon this success. There’s so much more to say about this and ever more to do but, for now, let’s live a little. Peace.
Tried BoT BoT on a new surface. Interesting results. Calibration to adapt to different surfaces (traction and unevenness) may prove essential. The addition of a proposed GPS, and potentially an IMU, may make life easier, though it would bump up costs. Certainly swarms of BoTs (the ultimate intention) would be more on task. All in all, the portability of this unit has been proven, in addition to field operations and maintenance.
No active sensors (yet) – navigation is based purely upon the timing of the motors, etc. All the same, a basic search pattern can be achieved. Next up: connect the sensors and write software to detect objects in the environment. Finally, the operations are launched from a smartphone but run autonomously onboard BoT BoT! 🙂