RoadStar | Crowd-based Assessment of Road Surface Quality

Background

Bad road surfaces physically stress both drivers and cars. They reduce travelling comfort, increase fuel consumption and create noise pollution. As a result, measuring and repairing roads is a regular task for road traffic departments everywhere. Detecting bad surfaces is mostly done with specialised measurement cars. This procedure is both expensive and time consuming, which strains budgets. As a result, repairs are often delayed or not done at all.

Thanks to research in the area of cyber-physical systems, it is nowadays possible to integrate measurement devices into any car. Using a crowdsourcing approach, one could hand out these measurement devices and then collect and evaluate all the measured data in a central instance. This data could then, for example, be used to create better road models allowing faster repairs, or to improve navigation to avoid “bumpy roads”.

Goal of RoadStar

The goal of this project is to develop and test a system which allows simple measurement of road surface data while driving, and to create a system for storage and evaluation of this data afterwards. Common hardware, such as Raspberry Pis, gyroscope and GPS sensors, should be combined into a portable measurement system.

Core concepts to be implemented are the synchronisation, transmission, storage and basic evaluation of the measured sensor data. With these sensors, data should be measured continuously at different points of the car. The collected data should then be synchronised and transmitted to a central server.

On the server, the data should be classified using existing algorithms, and transformed into a form which can be displayed on a simple web frontend.

Thank You And Goodbye

The practical course Internet of Things at TU Dresden has nearly finished. The final presentation (PDF: Final_Presentation) took place on Friday. We would like to thank our students for participating in the course and developing an impressive solution, based on multiple components and techniques, to measure the quality of road surfaces. We are looking forward to bringing the results to bear. We are still in talks with several companies that would allow us to test the outcomes on a wider scale. Stay tuned!

Final User Interface

During the last two weeks I worked on the final design of the user interface.

You are now able to view the sensor data in three separate graphs for each test ride. You can select the values you want to see by enabling or disabling them via checkboxes. In addition, you can now switch between raw and matched view, visualized either as lines or as points. Since we have only two quality categories, we removed the five quality selectors I had introduced some time ago in a blog post.

All in all the UI now has its full functionality. 😉

First threshold to classify “good” and “bad” roads

After visualizing the sensor data of a 25 time frame (from our first test drive), which contained both good and bad quality roads, we decided that the road quality can be extracted using only the z-acceleration – at least as a first threshold to classify between two simple qualities: “good/okay” and “(really) bad”.

(The green section in the first chart marks a good quality road, the blue sections a “normal” quality road and the red section a “(really) bad” quality road.)
accel-and-marks

For reference, charts for the gyroscope and rotation data:
gyro

rota


Finding a first threshold

We then combined our road quality notes from the first test drive with the z-acceleration data of the whole drive and decided to use 1.4 as a threshold to differentiate between regular-quality and “pothole” measurements:

21:49 “bad road”
received_2015-06-24_19-42-13

20:52 “really bad road”
received_2015-06-24_18-51-17

21:05 “quite good road”
received_2015-06-24_19-00-38

21:23 & 21:25 “really bad road”
received_2015-06-24_19-18-59
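As a minimal sketch, such a two-class threshold classifier could look like the following. The value 1.4 is the threshold from this post; the function and variable names, and the assumption that the raw z-acceleration sample is compared directly against the threshold, are our own:

```python
# Two-class road quality classification by thresholding the z-acceleration.
# THRESHOLD = 1.4 is the value chosen in the post; everything else
# (names, comparison against the raw sample) is illustrative.

THRESHOLD = 1.4

def classify_sample(acce_z: float) -> str:
    """Label a single z-acceleration sample as "good" (good/okay)
    or "bad" ((really) bad)."""
    return "bad" if acce_z >= THRESHOLD else "good"

def classify_ride(z_values):
    """Label every z-acceleration sample of a test ride."""
    return [classify_sample(z) for z in z_values]
```

For example, `classify_ride([0.98, 1.02, 1.7, 0.95])` labels only the third sample as "bad".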

First test drive in real life!

On Wednesday evening Armin, Stephan and I went out for a test drive. For this purpose we rented a car from a local car-sharing company called teilAuto. We met at the informatics faculty building and set up the whole measurement hardware.

IMG_20150624_203240782

During the test drive Armin watched the gathering of the data on his laptop. He also checked the hardware regularly.

IMG_20150624_220016416

We tried to cover a large part of Dresden and also included some potentially tricky situations, like roundabouts, passing the same junctions/streets several times, or driving through a tunnel. Apart from some problems with the timestamps and a short blackout caused by too high acceleration, everything went pretty well.

Now it is time to evaluate these data!

Controlling Our Measurement Device

After adding the status LEDs and the control button we implemented the control logic:

IOT15_Controlling_new

Sensing: As soon as the GPS data and the sensor data of all sensors are available (in real time) at the server-PI, it saves the data in CSV files (see this post for details).

Uploading: When the messbox is in the “Ready” state, CSV files exist and the web server responds to a ping, the server-PI tries to upload the CSV files via WebSocket. When the upload is successful, the CSV files are deleted.
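The upload precondition described above can be sketched as a small decision function. This is illustrative only; the post does not show the actual implementation, and all names here are made up:

```python
def should_upload(state: str, csv_files: list, server_reachable: bool) -> bool:
    """Return True only when all three upload preconditions hold:
    the messbox is in the "Ready" state, at least one CSV file is
    waiting, and the web server answered a ping (server_reachable)."""
    return state == "Ready" and len(csv_files) > 0 and server_reachable
```

Only when this returns True would the server-PI attempt the WebSocket upload and, on success, delete the uploaded CSV files.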

Small Upgrade On The Hardware Side

Our measurement device got a small hardware upgrade: Two status LEDs and a button:

RPI2_LEDs_Button_Schaltung

With R1 = U(R1) / I = (U(GPIO) – U(LED)) / 0.010A = (3.3V – 2.0V) / 0.010A = 130 Ohm
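The series-resistor sizing above can be double-checked with a quick calculation (voltages and current as in the formula; the 2.0 V LED forward voltage is the value assumed there):

```python
# LED series resistor: R = (GPIO high level - LED forward voltage) / LED current
U_GPIO = 3.3   # V, Raspberry Pi GPIO high level
U_LED = 2.0    # V, LED forward voltage assumed in the post
I_LED = 0.010  # A, target LED current

R1 = (U_GPIO - U_LED) / I_LED
print(round(R1))  # 130 (Ohm)
```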

We connected the button to GPIO 5 with a pull-up resistor (10 kOhm). This way the Raspberry Pi can be booted with the button (no software code needed).

USB Switch

Currently only our RPIs are powered by batteries – our switch is connected to the car's on-board socket (through a converter). If the whole messbox needs to run on battery packs only, there are adapters available to plug a low-power switch into a regular USB port (battery pack).

In our case (D-Link DGS-1008D) we need a 7.5V, 1A to USB adapter – available e.g. here(1).
(1) This was the only one we found online but the seller confirmed that it works for our device: “Yes, I just checked and our adaptor B00HM59B4C will suit the D-Link DGS-1008D. The current draw will be fine and the tip is the correct size.”

Aggregating geodata using geohashes

What we want to do

The situation is the following: we have a set of points with GPS coordinates. We want to visualize them in the browser, but with large datasets (e.g. millions of points), displaying all of them is not an option. Instead we want to aggregate them, in order to display multiple points near each other as a single point.

The following example uses MongoDB and a geocode system called geohash.

What is Geohash

Geohash is a representation of latitude/longitude coordinates as a short alphanumeric string.

An example: The geohash of 57.64911,10.40744 would be u4pruydqqvj.

The geohash has a useful characteristic: nearby points usually have similar geohashes, meaning they share the most leading characters.
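For illustration, the encoding itself is simple enough to sketch in a few lines of Python. This is the standard geohash algorithm, not code from the project:

```python
# Minimal geohash encoder (standard algorithm; illustrative sketch).
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat: float, lon: float, precision: int = 11) -> str:
    """Interleave longitude/latitude bisection bits and map each
    5-bit group to a base-32 character."""
    lat_range = [-90.0, 90.0]
    lon_range = [-180.0, 180.0]
    bits = []
    even = True  # a geohash starts with a longitude bit
    while len(bits) < precision * 5:
        rng, val = (lon_range, lon) if even else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits.append(1)
            rng[0] = mid  # keep the upper half
        else:
            bits.append(0)
            rng[1] = mid  # keep the lower half
        even = not even
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, precision * 5, 5)
    )
```

`geohash_encode(57.64911, 10.40744)` reproduces the example above, `u4pruydqqvj`.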

How we use it

The fact that nearby points have similar geohashes allows us to create a database query which aggregates multiple points into one.
In order to aggregate our data points we add a field containing the geohash shortened to a certain number of characters. This number depends on how big an area we want to aggregate into one point.
For example, a precision of 5 characters represents an area of about 4.9km x 4.9km, whereas 9 characters corresponds to about 4.8m x 4.8m (according to elastic.co).

  • Let’s say our stored data has the following structure and is stored in a collection called ‘point’:
      {
          '_id' : 'somemongodbobjectID',
          'gps' : [57.64911,10.40744],
          'geohash' : 'u4pruydqqvj'
      }
    
  • In order to aggregate all points we use MongoDB’s aggregation pipeline.
    1. The first step is to add the shortened geohash as a field; we will call it shortGeohash. This is done using MongoDB’s $project pipeline stage and the $substr operator, which creates a substring of an existing field. $substr takes three arguments: the field to shorten, the start index, and the length.
       db.point.aggregate([
           { $project : {shortGeohash: {$substr: ["$geohash", 0, 9]}}},
       ])
      
    2. The second step is to aggregate the points which have the same shortened geohash, meaning they are located in the same area of 4.8m x 4.8m (in this example). This is done with a $group stage; to know which documents were grouped together, we also $push the grouped documents using the $$ROOT variable.
       db.point.aggregate([
           { $project : {shortGeohash: {$substr: ["$geohash", 0, 9]}}},
           { $group: {_id: "$shortGeohash", count: {$sum:1}, originalDoc: {$push: "$$ROOT"}}}
       ])
      
    3. This results in an array of documents grouped by their shortened geohash. The points are now aggregated by location, and each group carries the IDs of its original documents, which allows us to do further work on the aggregated data.
       [{ _id: "u4pruydqq",
           count: 2,
           originalDoc: [{
               "_id": "5579b75416b8101ca37d9ab0",
               "shortGeohash": "u4pruydqq"
           }, {
               "_id": "5579b75416b8101ca37d9ab1",
               "shortGeohash": "u4pruydqq"
           }] 
       }, { _id: "u4pruydqr",
           count: 5,
           originalDoc:
           [{
               "_id": "5579b75416b8101ca37d9ab2",
               "shortGeohash": "u4pruydqr"
           }, {
               "_id": "5579b75416b8101ca37d9ab3",
               "shortGeohash": "u4pruydqr"
           }, {
               "_id": "5579b75416b8101ca37d9ab4",
               "shortGeohash": "u4pruydqr"
           }, {
               "_id": "5579b75416b8101ca37d9ab5",
               "shortGeohash": "u4pruydqr"
           }, {
               "_id": "5579b75416b8101ca37d9ab",
               "shortGeohash": "u4pruydqr" 
           }]
       }]
      
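The grouping step can also be sketched in plain Python, as a rough equivalent of the pipeline above that is handy for testing without a database (the function name is our own):

```python
from collections import defaultdict

def aggregate_by_geohash(points, precision=9):
    """Group points by the first `precision` characters of their geohash,
    mirroring the $project + $group pipeline above."""
    groups = defaultdict(list)
    for p in points:
        groups[p["geohash"][:precision]].append(p)
    return [
        {"_id": gh, "count": len(docs), "originalDoc": docs}
        for gh, docs in groups.items()
    ]
```

Two points whose geohashes share the first nine characters end up in the same group, i.e. in the same ~4.8m x 4.8m cell.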

Additional UI Elements

Stephan and I added some additional UI elements to the settings dialogue of the RoadStar webview.
At the moment, the elements are not yet connected to any functions. This will happen next week.

We added radio buttons for line or point view and a slider to choose the density of the displayed data. In addition, you can decide whether to see raw data or map-matched data. Finally we created some smileys for the different qualities; by clicking on them you will be able to choose different quality filters.

In the “About” dialogue, we added a link to this dev blog.

Bildschirmfoto vom 2015-06-15 20:56:42

Creating the Front-End

As we move along with the project, it's time to look at the front end. We want a clear, responsive design that is accessible to the normal user.
We use Leaflet, MapBox, jQuery and Chart.js to visualise our data.
This is the first draft and there is still a lot to do. We need to take a look at which features are needed, and how to implement them quickly.
The groundwork is there, now we need to handle the communication with the server and find a good way to visualise quality and state of the roads.

Bildschirmfoto - 09.06.2015 - 00:58:14

Bildschirmfoto - 09.06.2015 - 01:00:55

Testing the current state of the messbox

After working on the server-PI and the sensor-PIs separately, and often virtualized, we decided it was time to put the messbox together and see what happens.

iot_glue_setup

The PIs boot and connect automatically. The sensor-PIs start sensing right away and forward the data to the server-PI, which reads the GPS data and writes the received sensor data to CSV files as soon as all sensors are connected.

iot_glue_screenshot

Here is a sample CSV log file which contains a quake around timestamp 707020657 (line 844): sample csv file

The format of the csv file is:

sampleID, timestamp, sensorName (GPS), time, lat, lon, speed, course.
sensorName (FL), gyroX,gyroY,gyroZ,acceX,acceY,acceZ,rotaX,rotaY,rotaZ,
sensorName (FR), gyroX,gyroY,gyroZ,acceX,acceY,acceZ,rotaX,rotaY,rotaZ,
sensorName (BL), gyroX,gyroY,gyroZ,acceX,acceY,acceZ,rotaX,rotaY,rotaZ,
sensorName (BR), gyroX,gyroY,gyroZ,acceX,acceY,acceZ,rotaX,rotaY,rotaZ,
measurementID
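The exact file layout isn't shown in the post. Assuming each record arrives as one flat list of values in the field order above (a GPS block followed by one block per wheel sensor), a parser sketch could look like this; all names and the flat-record assumption are ours:

```python
SENSOR_FIELDS = ["gyroX", "gyroY", "gyroZ", "acceX", "acceY", "acceZ",
                 "rotaX", "rotaY", "rotaZ"]

def parse_record(row):
    """Parse one measurement record, assuming the field order listed above:
    sampleID, timestamp, a 6-value GPS block, one 10-value block per wheel
    sensor (FL, FR, BL, BR), then measurementID."""
    rec = {"sampleID": row[0], "timestamp": row[1]}
    i = 2
    # GPS block: sensorName, time, lat, lon, speed, course
    rec["gps"] = dict(zip(["sensorName", "time", "lat", "lon", "speed", "course"],
                          row[i:i + 6]))
    i += 6
    for _ in range(4):  # FL, FR, BL, BR
        name = row[i]
        values = [float(v) for v in row[i + 1:i + 10]]
        rec[name] = dict(zip(SENSOR_FIELDS, values))
        i += 10
    rec["measurementID"] = row[i]
    return rec
```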

And here is a chart of the test quake:

received_2015-06-07_19-56-44_chart