UGV - Lidar direct code [success, eventually!]

Okay, whilst mentally hopping between a few things, I decided to "go back to basics" and take the Docker and ROS2 aspects out of the equation.


Why couldn't I go back to "bare metal" on the Raspberry Pi 5 and just use Python (yes, I know, I know) to read and write to the /dev/ ports? To save the UGV driving off the foot stool, I propped it up on some old books that were leaning against a wall.  This frees up the wheels so they can move without it driving off the top!

As it's a SUNDAY, I'm having a relaxed day, as you can tell!



I found quite a few lines of code on the good old inter-webs and, of course, it all works "perfectly fine" for everyone else. Of course it does. Sigh.

It's just me then, is it?  I did wonder about this, and I tried to find real videos / photos / websites that show the LiDAR actually working with the UGV. Y'know what? Yep - couldn't find anything.  Hmmm.... I started to scratch my beard at this point (hey, it's been Xmas time, it's time to let it grow until next week)


Here's a DEBUG output from the Python code; as you can see, I am able to decode the LiDAR binary data into HEX that can then be processed.

https://forum.youyeetoo.com/t/ld19-process-the-signal-from-the-lidar-directly/507/4
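For anyone wanting to reproduce that DEBUG output, here's a minimal sketch of the sort of thing the code does. Assumptions on my part: the LiDAR appears on /dev/ttyUSB0, runs at the LD19's default 230400 baud, and emits fixed 47-byte frames starting 0x54 0x2C - check those against your own setup.

```python
def hexdump(data: bytes) -> str:
    """Format raw bytes as space-separated hex, like the DEBUG output."""
    return " ".join(f"{b:02X}" for b in data)

def read_packet(port) -> bytes:
    """Read one 47-byte LD19 frame from a serial/file-like object.

    Assumed frame layout: 0x54 header byte, 0x2C ver/len byte,
    then measurement data - 47 bytes in total.
    """
    buf = bytearray()
    while len(buf) < 47:
        chunk = port.read(47 - len(buf))
        if not chunk:  # timeout / EOF
            break
        buf += chunk
    return bytes(buf)

# Usage with pyserial (hardware required):
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 230400, timeout=1) as ser:
#       print(hexdump(read_packet(ser)))
```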


As you can see, there is no ERROR accessing /dev/serial0 (for UGV Rover access) or /dev/ttyUSB0 (for the LiDAR). The LiDAR data is retrieved - sometimes it looks a little odd - but the code to determine an ACTION is executed, and the command to send to the UGV itself is output in the JSON format needed to make the UGV move.
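The "determine an ACTION" part boils down to pulling the 12 distance readings out of each frame and applying a rule. A sketch, assuming the LD19 frame layout from the datasheet; the field offsets and the 300 mm threshold here are my illustrative choices, not anything from the UGV code:

```python
import struct

def parse_points(pkt: bytes):
    """Extract the 12 (distance_mm, intensity) pairs from one LD19 frame.

    Assumed layout: 0x54 header, ver/len, speed (u16), start angle (u16),
    12 x (u16 distance in mm + u8 intensity), end angle, timestamp, CRC.
    """
    assert len(pkt) == 47 and pkt[0] == 0x54, "not an LD19 frame"
    points = []
    for i in range(12):
        dist, inten = struct.unpack_from("<HB", pkt, 6 + i * 3)
        points.append((dist, inten))
    return points

def decide_action(points, stop_mm=300):
    """Toy rule: STOP if any valid reading is closer than stop_mm."""
    valid = [d for d, _ in points if d > 0]
    return "STOP" if valid and min(valid) < stop_mm else "FORWARD"
```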


ATM, the code executes, but the UGV is not responding.  I am wondering if it was plugged in properly when it was taken apart a few weeks ago???  I'll need to see if I can send commands to the UGV outside of this code and check whether the Jupyter notebook code can make it move. Anyway, that's for later.

btw - this was the original code that pointed me in the right direction: https://github.com/KD5VMF/Waveshare-Rover/tree/main


Why do I make the statement about the CrcTable potentially being wrong? Well, from this website: https://github.com/LudovaTech/lidar-LD19-tutorial

I discovered the following (see image below) - now, THIS might be why all the ROS2 driver code fails to work properly?  I might need to write my own (or lift this code & write my own component node)
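For reference, the LD19's checksum is (as I understand it from that tutorial) a table-driven CRC-8 with polynomial 0x4D, computed over the first 46 bytes of the frame. Rather than transcribing the 256-entry table by hand - and risking exactly the kind of transcription error suspected above - the table can be generated. This is a sketch, so verify the polynomial against the datasheet:

```python
def make_crc_table(poly: int = 0x4D) -> list:
    """Generate the 256-entry CRC-8 lookup table for the given polynomial."""
    table = []
    for byte in range(256):
        crc = byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
        table.append(crc)
    return table

CRC_TABLE = make_crc_table()

def crc8(data: bytes) -> int:
    """CRC-8 over the frame bytes; compare against the frame's final byte."""
    crc = 0
    for b in data:
        crc = CRC_TABLE[(crc ^ b) & 0xFF]
    return crc
```

A frame is valid when `crc8(pkt[:46]) == pkt[46]` - a handy way to spot the "looks a little odd" packets.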



I'll try this code change shortly, however, for now, I'm going to take the UGV apart and check all the connections....



UPDATE: I'll be back.

Firing up the Jupyter notebook and checking out the CHASSIS example - then running it and "the wheels on the bus went round & round" - indeed they did.  So, what was different?

AH HA!


As I'm using a Raspberry Pi 5 device, the code should be using /dev/ttyAMA0 and NOT /dev/serial0

Let me go change that real quick :-D

Hmmm... doesn't seem to have made any difference.  

The THEORY was right though and the output implies that it's all okay.  Something is slightly amiss.

Maybe I need ANOTHER cup of tea?

BRB

UPDATE UPDATE:

okay, I figured it out.  The SPEED values being passed as the L and R values were too high.  I need to fine-tune it, but for now it is working with some smaller values.  A quick commenting-out of a load of the print() calls:
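In case it helps anyone else, here's roughly the fix: clamp whatever speeds are requested down to a safe cap before building the JSON. The {"T":1,"L":...,"R":...} message shape is my reading of the ugv_rpi examples, and the 0.2 cap is just the "simple smaller value" that worked for me - treat both as assumptions:

```python
import json

MAX_SPEED = 0.2  # conservative cap for bench testing; fine-tune later

def build_speed_cmd(left: float, right: float) -> str:
    """Clamp the wheel speeds and build the UGV's JSON speed command."""
    clamp = lambda v: max(-MAX_SPEED, min(MAX_SPEED, v))
    return json.dumps({"T": 1, "L": clamp(left), "R": clamp(right)}) + "\n"

# Usage (needs pyserial; note the Pi 5 UART is /dev/ttyAMA0, not /dev/serial0):
#   import serial
#   with serial.Serial("/dev/ttyAMA0", 115200, timeout=1) as ser:
#       ser.write(build_speed_cmd(0.5, 0.5).encode())  # gets clamped to 0.2
```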


and in true Dawn Ahukanna style, "it didn't happen if there wasn't a video", here you go:


I probably need to unplug from the monitor, keyboard & mouse and see how the UGV Rover behaves when "on the ground".


I'll paste the code for test1.py into a GitHub repo, so it's not lost, along with these lessons learnt.



Now, back to ROS2 :-D 





btw - to give you an example, here's a list of all the web-pages that I have open across several environments:

https://wiki.ros.org/velodyne/Tutorials/Getting%20Started%20with%20the%20Velodyne%20VLP16

https://github.com/waveshareteam/ugv_rpi

https://github.com/waveshareteam/ugv_ws

https://github.com/ldrobotSensorTeam/ldlidar_stl_ros2

https://www.waveshare.com/wiki/D500_LiDAR_Kit

https://github.com/ArendJan/LD_LIDAR_python

https://github.com/ldrobotSensorTeam/ldlidar_stl_ros2/issues/2

https://github.com/ldrobotSensorTeam/ldlidar_stl_ros2/issues/12

https://www.waveshare.com/wiki/DTOF_LIDAR_LD19

https://github.com/Myzhar/ldrobot-lidar-ros2

https://github.com/YahboomTechnology/DTOF-mini-LiDar/tree/main

https://github.com/Gafarrell/PiLD19/tree/main

https://github.com/KV1999/ld19_lidar

https://github.com/halac123b/Visualize-data-from-Lidar-LD19_Matplotlib-Python

https://github.com/zaki-x86/LiDAR-object-detection/tree/master

https://forum.youyeetoo.com/t/ld19-process-the-signal-from-the-lidar-directly/507/4

https://github.com/LudovaTech/lidar-LD19-tutorial

https://github.com/sige5193/bittly

https://github.com/richardw347/ld19_lidar



ooo! look what I found.  This implies number [11] has a connector to plug the LiDAR directly into the board and not via USB


although it does imply that power (+/-) is provided, so it might be do-able.  I wonder if this would make /dev/serial0 the place to control it from instead of /dev/ttyAMA0

Ah, okay, I also wonder what that would do.  For instance, I "believe" the current board that the LiDAR connects to is an ESP32 board running some code, and oddly (need to ask Chris C. where he got this one from?!) this board is different to the one I just got in the post - (I got the board & cables, not the LiDAR).  Note how this one is smaller, and although it looks like it is just doing a UART (serial) to USB conversion, I think it's also running some code to convert the LiDAR data into a format that can be processed by the RPi device.  I could literally be wrong here, but the Arduino code I found implies this.

So... if you now remove that board from the equation, do I now have to do the conversion / processing of that data myself? If so, would that be on the driver board above (and if so, how?), or would it be passed through to the RPi5, which would then eat up a chunk of processing there?

This might be why this is offloaded to an external board itself.

Now, where did I put those screwdrivers? Well, that wasn't long before getting them out again!?!


I am VERY tempted to build another one of these robots:

I noticed I can get the base (including the driver board) for £135 (+P&P)

https://thepihut.com/products/6x4-off-road-ugv-kit-esp32-driver

Then a lidar for £130 (+P&P but could be included with above)

So that is £235 so far.

Then I could get the camera for £137 from Amazon (or £118 from AliExpress)

https://www.amazon.co.uk/gp/product/B0CRB41TXZ/ref=ewc_pr_img_1?smid=A3U321I9X7C9XA&psc=1

that comes with "another" controller board, "just in case"

I already have spare RPi4 and RPi5 devices I can slot into it.

Instead of £432, I could get it in bits for £135 + £130 + £137 = £402 (+P&P)

hmmm.... okay, maybe not such a good bargain after all.

However, that base for £135 sure is tempting.


Then I'd just need to find a cheap LiDAR missing some cables. Oh, "Hello eBay" - there's one for £17 (inc P&P)! Okay, so it's not the LD19, it's the earlier LD06, but the drivers are similar / the same. Oh, sod it, for that price I'll buy the LiDAR; I can use it on "other things" if I need to.

hang on, that now makes it:

£135 + £17 = £152 (+P&P)

£152 + £137 = £289

oh, now we're in the realms of possibilities.  Maybe I should just get the base robot & look to get the camera "later", hmmmm

From Waveshare itself, the cost without an RPi is £340

It is about a £50 saving... but do I "really need that amazing gimbal camera" just yet?

Oh Ali-Express, what are you doing to me? £95


£152 + 95 = £247

That is about a £100 "saving". lol. I like the justification there - I'm "saving" £100 by spending £250 that I haven't got. sigh.

Right, time to close the laptop & go do something else, as this could get VERY expensive all of a sudden!

OMG! I also "need" these as well....


* must resist * must resist * got more important things to spend money on, like a campervan!!!!


UPDATE: I bought the cheap lidar and will ponder the PiHut base unit next week.


