A Raspberry Pi IoT device to collect and post field data

This is the third blog post in a series that uses the Raspberry Pi as an Internet of Things device to collect environmental data (with a DHT22 weather sensor) and post the data in real time to an ArcGIS Online feature service. The post below is a highly synopsized version of a chapter in the forthcoming third edition of the book, Tech-enabled Field Studies.

Begin by setting up a Raspberry Pi with the DHT22 sensor. See The Pi Becomes a Data Collector, below.

Use the Pi’s command prompt to make the following installations and configurations:

  1. pip3 install virtualenv
  2. virtualenv datalogger
  3. cd datalogger
  4. source bin/activate
  5. pip3 install Adafruit_DHT
  6. pip3 install pandas
  7. pip3 install arcgis --no-deps

Note: Step 6 above (pandas) may take several minutes. The steps above create a Python virtual environment called datalogger and install three libraries into it, including a streamlined ArcGIS API for Python. When you run the weather script below, you will need to invoke it from this environment, as described below.

Continue the set-up by:

  • Configuring the ArcGIS Online service as described in the previous blog post, A Time-aware Geo Data Bucket.
  • Download the sample Python file that reads data from the DHT22 and posts to ArcGIS Online. The Python file is here, or download/clone the entire project here. Be sure to update the Python file with your information, as described in the file.
  • Put the Python file in your datalogger directory to help you remember that it must be run from that environment. Once you have added your data to the Python script and have the ArcGIS Online service running, you can run the script by using the command:
    • /home/pi/datalogger/bin/python3 addwx.py

The service has a date/time attribute, so it can be made time-aware. Once the service is added to a map, you may also choose a symbology that you prefer.

Finally, on the Pi, you may choose to set up a cron job to schedule the script execution on a recurring basis. A quick web search will turn up full tutorials, but a minimal example follows. Enjoy!
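For reference, a crontab entry like the line below (added with crontab -e) would run the script every 15 minutes; the interval and the log path are assumptions to adjust for your setup:

  • */15 * * * * /home/pi/datalogger/bin/python3 /home/pi/datalogger/addwx.py >> /home/pi/datalogger/addwx.log 2>&1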


The Pi Becomes a Data Collector

The Raspberry Pi can connect directly to sensors. From temperature and humidity to light, motion, sound, and well beyond, these add-ons are sold individually or in large kits. This ability to read and process data from a sensor turns your Raspberry Pi into an instant IoT device, for under $50.

Get a Sensor

In this project, a 3-pin DHT22 sensor (a capacitive humidity sensor paired with a thermistor, for humidity and temperature) will be used. If ordering from Amazon, you can either get sensors cheap or fast, but rarely both. I purchased the $9.99 Gowoops version of the DHT22 (2 sensors at $5 a sensor) and received it in about two days. I also purchased the $3.68 Sodial version (1 sensor) and received it in about one month. Make sure the sensor you order comes with cables/wires to connect the sensor to the Pi. If you’re lucky enough to have a MicroCenter or similar store in your city, you might also look there. There doesn’t appear to be much difference in sensor quality (in my two sensors); what I paid for was speed of delivery.

Pi Compatibility

The easiest way to connect a sensor to a Raspberry Pi is by using the built-in header, available on current Pi models (except the Pi Zero). On the Pi 3 B+, the basic GPIO header is a 2×20 male pin connector panel built into the Pi’s board (visible on the top edge of the board pictured). Each pin has an address; some pins are designated for data input or output, others for power, and some are ground. Connecting the sensor is a matter of getting the three DHT22 wires connected to the correct pins.

Generally, for this type of project, I recommend the Pi 3 B+, due to its processor speed and built-in hardware. A Pi Zero, for example, does not have a built-in header; one must be soldered in place first.

Connect the Sensor

  1. Turn the Pi off and unplug the power.
  2. Remove the case. Position the Pi’s board so the header sits at the top edge (away from you). Look at the GPIO header diagram below. Locate pin 1, which is on the left side of your board in the row closest to you.
  3. Connect the 3 colored jumper cables from the sensor to the GPIO.  Cable colors will vary.  Technically, on the header, it doesn’t matter which power, ground, or GPIO pins you use, as long as you know the GPIO number of the data cable.  It will need to be used in your test program.  I recommend:
    1. Power cable: sensor (+) to header pin 1
    2. Data cable: sensor (out) to header GPIO4 (same as pin 7)
    3. Ground cable: sensor (-) to header pin 6
  4. Put the case back on the Pi.
  5. Connect the cables to the DHT22, as noted above.
  6. Plug the power back into the Pi and turn on.

Install the Software

Let’s make sure the sensor is working. In order to use this or any sensor, you’ll need to install software (a Python library) that can read and interpret the data coming from the sensor. In this case, we’ll install the Adafruit_DHT library and then write some short Python code to output the sensor data.

From here, I assume you have any version of Python 3 installed.  It’s installed by default, so unless you removed it, it should be a safe assumption.

Start Terminal. By default, this launches a black window with the bash prompt. At that prompt, install the Python library that reads the sensor data. The command is case-sensitive.

  • pip3 install Adafruit_DHT

The command will download and install the software and hopefully give you a confirmation that everything went well. Techie note for conda users: the library doesn’t appear to be loaded into the Anaconda cloud. Download and manually install it (like we did with arcgis) if you need the sensor exposed to your conda environment.

Test the Sensor

You need a Python IDE (integrated development environment), a place to write and run code. While most Pis already have at least one Python IDE, you may find it necessary to download one from: Start menu > Preferences > Recommended Software.

Thonny and Geany were both pre-installed on my Pi and work fine for this task. Python 3 IDLE is not advisable for this project. With Thonny open, create a new file and copy/paste the Python code below (or download the .py file from here).

import Adafruit_DHT

sensor = Adafruit_DHT.DHT22
# This is the GPIO pin number, not the physical pin number.
# The data wire sits on physical pin 7, which is GPIO pin 4. Use 4 below.
pin = 4
humidity, temperature = Adafruit_DHT.read_retry(sensor, pin)
if humidity is not None and temperature is not None:
    temperature = temperature * 9 / 5.0 + 32  # convert Celsius to Fahrenheit
    print('Temp={0:0.1f}*F  Humidity={1:0.1f}%'.format(temperature, humidity))
else:
    print('Failed to get reading. Try again!')


A few notes on the code:

  • Lines starting with # (hash marks) are comments.
  • The indentation under the if and else lines is important; indent each of those lines one tab. This is how Python expresses logic and code “flow” (with indentation).
  • Note the import statement at the top. This is where Python loads the Adafruit library and the magic starts.
  • Most importantly, note the pin = line. This is the GPIO number, not the physical pin number.

This sample script is derived from Adafruit’s simpleTest.py script. I’ve just simplified it a bit, but you can find the original here.

If the script runs correctly, the output should look something like:

  • Temp=67.5*F  Humidity=40.1%

Extension Ideas

  • Put a for loop around your code and make it run repeatedly (a minimal sketch follows this list).
  • Put a time.sleep() function in the for loop to space out sampling events.
  • Log your data.
  • Calculate the heat index, based on the temp and humidity data.
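As a starting point, here is a minimal sketch combining the first three ideas; the sample count, the 60-second interval, and the file name are assumptions to adjust for your own study.

import csv
import time
from datetime import datetime

import Adafruit_DHT

sensor = Adafruit_DHT.DHT22
pin = 4  # GPIO number of the data wire

# Take 10 samples, one every 60 seconds, and append them to a CSV log.
with open('dht22_log.csv', 'a', newline='') as f:
    writer = csv.writer(f)
    for _ in range(10):
        humidity, temperature = Adafruit_DHT.read_retry(sensor, pin)
        if humidity is not None and temperature is not None:
            temperature = temperature * 9 / 5.0 + 32  # C to F
            writer.writerow([datetime.now().isoformat(), temperature, humidity])
            f.flush()  # write each sample to disk immediately
        time.sleep(60)  # space out sampling events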


GeoInquiries Map and Data Search

The experimental GeoInquiry search tool allows an instructional designer or advanced GeoInquiry user to search GeoInquiry maps and data for keywords (and optionally add that data to their own map). The tool includes all Esri-produced GeoInquiries as well as state GeoInquiries content.

For example, a search for “climate” will search the metadata (including title and description) of about 150 maps and nearly 1,000 data services while ignoring all non-GeoInquiry content in ArcGIS Online.

The database behind the tool is updated twice monthly to reflect the latest changes in map and data service metadata. Explore the tool at http://edgis.org/geoinquirysearch


Harvesting ArcGIS Online Data and Maps Metadata

This short article describes a process where Python was used to harvest metadata from a list of identified ArcGIS Online maps and the maps’ data services. The data were logged to MySQL (with pymysql); a PHP web search and discovery page was created.  The process allows for keyword searching in titles and descriptions of maps and data layers.

Why harvest metadata?

This approach was used because our collection of 150 maps is housed in several AGO organizations, with data services spread across even more. These data and maps are designed for student use and vetted, which makes school curriculum authors interested in searching for and discovering this “good” data. Essentially, we have a target population that is keenly interested in a subset of scattered data and maps.

Who might use this approach?

This article may be of interest to developers needing to create a search solution across a specific list of maps and constituent data services.

The Approach

Using the ArcGIS API for Python 1.5.2 and a prebuilt list of map IDs, a script was built that iterates over the list, logging titles and descriptions for the maps.

import csv
from arcgis.gis import GIS
from arcgis.mapping import WebMap

gis = GIS()  # anonymous connection; use GIS(url, username, password) for private content

with open('data/maps.csv') as f:
    reader = csv.reader(f)
    for row in reader:
        result = gis.content.get(row[0])
        web_map_obj = WebMap(result)
        layers = web_map_obj.layers

Using the layers attribute now available for the map object, we then loop over all the layers in the map (above), also logging titles and descriptions.

We then log the data to a MySQL table (with a custom function), like:

dbwriter(
  objectType='Webmap',
  mapId=result.itemid,
  objectName=result.title,
  url='http://www.maps.arcgis.com/home/webmap/viewer.html?webmap=' + result.itemid,
  description=result.snippet
  )
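The dbwriter function is custom and not shown in full in this article; a minimal sketch using pymysql might look like the following (the connection settings and the table/column names are assumptions):

import pymysql

conn = pymysql.connect(host='localhost', user='dbuser',
                       password='dbpass', database='mapharvest')

def dbwriter(objectType, mapId, objectName, url, description):
    # Insert one row of harvested metadata into the search table.
    with conn.cursor() as cur:
        cur.execute(
            'INSERT INTO harvest (objectType, mapId, objectName, url, description) '
            'VALUES (%s, %s, %s, %s, %s)',
            (objectType, mapId, objectName, url, description))
    conn.commit()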

Then loop over each layer and log to the table, slightly changing parameters as necessary.
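As an illustrative sketch only (layer properties vary by service type, but title and url are common):

# Log each layer in the current map, mirroring the web map call above.
for layer in web_map_obj.layers:
    dbwriter(
        objectType='Layer',
        mapId=result.itemid,
        objectName=layer.title,
        url=layer.url,
        description=''  # layer definitions rarely carry their own description
        )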

You can access the script at GitHub below. The Python code, a sample CSV input file, and a sample SQL script for generating the MySQL table are included.

GitHub: https://github.com/trbaker/mapHarvest

Python and Cron to Refine SSO Account Creation

Single sign-on offers great benefits for schools using ArcGIS Online (AGO) and other SaaS products. Today, one limit of SSO is that the SAML implementation offers an efficient but limited subset of the features one might need to effectively manage a large user base.

For example, I was approached by a large K12 educational entity (more than 10,000 users) that wanted to implement SSO for user account creation and authentication. However, they also wanted teachers to have publisher roles and students to have user roles. The current enterprise login system in ArcGIS Online casts everyone into one role. Enter Python!

Begin by setting up your workstation or server, ideally one with the capacity to schedule tasks (cron, Windows Task Scheduler, macOS launchd or iCal). When the script is ready, you will need to use a scheduler to run the task repeatedly, in part because there is no way to trigger the script on an SSO event in AGO (at least today).
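For example, on Linux or macOS, a single crontab entry (added with crontab -e) could run the finished script hourly; the interpreter and script paths below are assumptions:

0 * * * * /home/user/anaconda3/bin/python /home/user/sso_roles.py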

The script is built on a Python stack including the ArcGIS API for Python 1.5.1, Python 3.6, and Anaconda 4.4. You can use Jupyter to create the script, but it is unnecessary for regular or scheduled execution. See API and stack installation details.

The script retrieves all the user accounts in the target organization and then filters out the unnecessary users. In this script, three filters are applied (sketched in code after this list):

1.    Ignore accounts without the autogenerated appended organization name (case-sensitive). Check for the appended org subdomain (applied automatically by SSO and other auto-generated username mechanisms). In this case, the pattern we want looks like: tbaker_SchoolDistrictX, where “SchoolDistrictX” is the organization’s subdomain.  This doesn’t guarantee the account was created via SSO but can rule out manually created accounts, depending on username policy in the organization.

2.    Ignore user accounts more than one week old.  This also prevents manual account edits (to accounts over one week old) from being reverted accidentally by the script.

3.    Use a regular expression to sort the usernames into teacher versus student. In this example, student usernames all contained six consecutive digits, where teacher usernames did not. Student accounts were filtered out, since all new SSO accounts were created at the lower student level (user). When a username without six consecutive digits was found, it was passed through.
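As an illustration only (not the published script), the three filters might look something like the sketch below; the org subdomain, the one-week window, and the regular expression are all assumptions to adapt to your organization:

import re
import time

from arcgis.gis import GIS

gis = GIS('https://www.arcgis.com', 'admin_username', 'password')

ONE_WEEK_MS = 7 * 24 * 60 * 60 * 1000
now_ms = time.time() * 1000

for user in gis.users.search(max_users=10000):
    # Filter 1: keep only usernames with the appended org subdomain.
    if not user.username.endswith('_SchoolDistrictX'):
        continue
    # Filter 2: ignore accounts more than one week old (user.created is epoch ms).
    if now_ms - user.created > ONE_WEEK_MS:
        continue
    # Filter 3: six consecutive digits marks a student username; skip it.
    if re.search(r'\d{6}', user.username):
        continue
    # A teacher account passed all three filters.
    print(user.username)  # swap for user.update_role('org_publisher') after testing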

Get the script at GitHub.

Once a user passes all the filters, the update_role function is used to upgrade the account to the built-in role, org_publisher. This script would need modification if used with a custom role.

Any of the filters can be changed or removed based on organizational need. The regular expression in filter three will most certainly have to be modified by everyone.  Many will want to run the script daily or even hourly, requiring a commensurate time change in filter two.

Of course, be sure to comment out the update_role function until you have completely tested the script in your environment. I generally insert a print statement in its place for feedback during testing. Like most scripts, there’s no “Undo” button.

For initial deployment, I saved the script to a .py file and scheduled it to run from a laptop on the corner of my desk; very shortly, however, I’ll deploy it to a dedicated Amazon Lightsail Ubuntu server. Enjoy!
