Tech Enabled Field Studies, ed 2

We created this book for educators who want to do research with learners – typically classroom teachers working with their students in Earth Systems or Environmental Science, Geography, or History. The book will also be useful to those running outdoor education or field research programs for students of all ages; however, as the title suggests, we focus on tech-enabled methods, tools, and analysis. Most of the content assumes that the research will be conducted outside, or “in the field.” We believe the book provides powerful justification for including these projects in your classes and would make a great addition to any reference library on field research techniques.

Chapters in edition two include:

  • What are field studies and why use them?
  • Designing a field study
  • Data, Data, Data
  • Field Instruments
  • ArcGIS Online and Field Studies
  • Geotagging Images for Field Studies
  • Survey123 Web
  • Creating an Editable Feature Service in ArcGIS Online
  • Editable Feature Services and ArcGIS Desktop
  • Mapping and Data Analysis in ArcGIS for Desktop

To purchase the book, visit Amazon or GISetc.

 

GeoInquiries Map and Data Search

The experimental GeoInquiry search tool allows an instructional designer or advanced GeoInquiry user to search GeoInquiry maps and data for keywords (and optionally add that data to their own map). The tool covers all Esri-produced GeoInquiries as well as state-produced GeoInquiry content.

For example, a search for “climate” will search the metadata (including title and description) of about 150 maps and nearly 1,000 data services while ignoring all non-GeoInquiry content in ArcGIS Online.

The database behind the tool is updated twice monthly to reflect the latest changes in map and data service metadata. Explore the tool at http://edgis.org/geoinquirysearch


Harvesting ArcGIS Online Data and Maps Metadata

This short article describes a process in which Python was used to harvest metadata from a list of identified ArcGIS Online maps and the maps’ data services. The data were logged to MySQL (with pymysql), and a PHP web page was built for search and discovery. The process allows keyword searching across the titles and descriptions of maps and data layers.

Why harvest metadata?

This approach was used because our collection of 150 maps is housed in several ArcGIS Online organizations, with data services spread across even more. These data and maps are designed and vetted for student use, so school curriculum authors want an easy way to search and discover this “good” data. Essentially, we have a target population that is keenly interested in a subset of scattered data and maps.

Who might use this approach?

This article may be of interest to developers needing to create a search solution across a specific list of maps and constituent data services.

The Approach

Using the ArcGIS Python API 1.5.2 and a prebuilt list of mapIds, a script was built that iterates over the list, logging titles and descriptions for the maps.

import csv
from arcgis.gis import GIS
from arcgis.mapping import WebMap

gis = GIS("home")  # or GIS(url, username, password)

with open('data/maps.csv') as f:
    reader = csv.reader(f)
    for row in reader:
        result = gis.content.get(row[0])   # first column holds the web map's item ID
        web_map_obj = WebMap(result)
        layers = web_map_obj.layers        # layer list used in the loop described below

Using the layers attribute now available on the map object (above), we then loop over all the layers in the map, also logging titles and descriptions.
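
A hedged sketch of that layer loop might look like the following. The title and itemId attribute names come from the web map specification rather than from the project’s script, and item-backed layers are resolved with gis.content.get to pick up a snippet or description:

for layer in web_map_obj.layers:
    layer_title = layer.title                           # display title from the web map JSON
    item_id = getattr(layer, 'itemId', None)            # present only for item-backed layers
    layer_desc = ''
    if item_id:
        layer_item = gis.content.get(item_id)
        layer_desc = layer_item.snippet or layer_item.description or ''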

We then log the data to a MySQL table (with a custom function), like:

dbwriter(
  objectType='Webmap',
  mapId=result.itemid,
  objectName=result.title,
  url='http://www.maps.arcgis.com/home/webmap/viewer.html?webmap=' + result.itemid,
  description=ssnippet
  )

We then loop over each layer and log it to the table, changing the parameters slightly as necessary.
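
For reference, a dbwriter helper of this sort might be implemented with pymysql roughly as follows; the connection details and the table name (“harvest”) are placeholders rather than the project’s actual schema:

import pymysql

conn = pymysql.connect(host='localhost', user='harvester',
                       password='***', database='mapharvest')

def dbwriter(objectType, mapId, objectName, url, description):
    # Insert one map or layer record into the metadata table
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO harvest (objectType, mapId, objectName, url, description) "
            "VALUES (%s, %s, %s, %s, %s)",
            (objectType, mapId, objectName, url, description))
    conn.commit()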

You can access the script at GitHub below. The Python code, a sample CSV input file, and a sample SQL script for generating the MySQL table are included.
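
To illustrate the keyword search the PHP page performs, a query against that same table might look like the snippet below; again, the table and column names are assumptions based on the dbwriter call above, not the repository’s actual schema:

keyword = 'climate'
pattern = '%' + keyword + '%'
with conn.cursor() as cur:
    cur.execute(
        "SELECT objectName, url FROM harvest "
        "WHERE objectName LIKE %s OR description LIKE %s",
        (pattern, pattern))
    for name, url in cur.fetchall():
        print(name, url)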

GitHub: https://github.com/trbaker/mapHarvest

Python and Cron to Refine SSO Account Creation

Single sign-on (SSO) offers great benefits for schools using ArcGIS Online (AGO) and other SaaS products. Today, one limit of SSO is that the SAML implementation offers an efficient but limited subset of the features one might need to effectively manage a large user base.

For example, I was approached by a large K-12 educational entity (more than 10,000 users) that wanted to implement SSO for user account creation and authentication. However, they also wanted teachers to have publisher roles and students to have user roles. The current enterprise login system in ArcGIS Online casts everyone into a single role. Enter Python!

Begin by setting up your workstation or server, ideally one with the capacity to schedule tasks (cron, Windows Task Scheduler, or macOS launchd/iCal). When the script is ready, you will need a scheduler to run the task repeatedly, in part because there is no way to trigger the script on an SSO event in AGO (at least today).

The script is built on a Python stack including the ArcGIS API for Python 1.5.1, Python 3.6, and Anaconda 4.4. You can use Jupyter to create the script, but it is unnecessary for regular or scheduled execution. See API and stack installation details.

The script retrieves all the user accounts in the target organization and then filters out the unnecessary users. In this script, three filters are applied:

1.    Ignore accounts without the autogenerated appended organization name (case-sensitive). Check for the appended org subdomain (applied automatically by SSO and other auto-generated username mechanisms). In this case, the pattern we want looks like: tbaker_SchoolDistrictX, where “SchoolDistrictX” is the organization’s subdomain.  This doesn’t guarantee the account was created via SSO but can rule out manually created accounts, depending on username policy in the organization.

2.    Ignore user accounts more than one week old.  This also prevents manual account edits (to accounts over one week old) from being reverted accidentally by the script.

3.    Use a regular expression to sort the usernames into teacher versus student (see the sketch after this list). In this example, student usernames all contained six consecutive digits, whereas teacher accounts did not. Student accounts were filtered out, since all new SSO accounts were created at the lower student level (user). When a username without six consecutive digits was found, it was passed through.
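
A condensed sketch of all three filters (and the role change they gate) might look like the code below. The org subdomain “SchoolDistrictX”, the portal URL, and the admin credentials are placeholders, and a print statement stands in for the role change while testing:

import re
import time
from arcgis.gis import GIS

gis = GIS("https://schooldistrictx.maps.arcgis.com", "admin_username", "***")
one_week_ago_ms = (time.time() - 7 * 24 * 3600) * 1000   # user.created is reported in epoch milliseconds

for user in gis.users.search(max_users=10000):
    # Filter 1: require the appended org subdomain, e.g. tbaker_SchoolDistrictX
    if not user.username.endswith('_SchoolDistrictX'):
        continue
    # Filter 2: ignore accounts more than one week old
    if user.created < one_week_ago_ms:
        continue
    # Filter 3: six consecutive digits marks a student username; skip those
    if re.search(r'\d{6}', user.username):
        continue
    # Anything left should be a newly created teacher account
    print('Would promote:', user.username)   # swap in user.update_role('org_publisher') after testing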

Get the script at GitHub. >>

Once a user passes all the filters, the “update_role” function is used to upgrade the account to the built-in “org_publisher” role. This script would need modification if used with a custom role.
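
For the built-in role, the call is a one-liner; for a custom role, the Role object would need to be looked up first. The role ID below is a placeholder:

user.update_role('org_publisher')                       # built-in publisher role, by name

# For a custom role instead (ID is hypothetical):
# custom_role = gis.users.roles.get_role('<custom role id>')
# user.update_role(custom_role)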

Any of the filters can be changed or removed based on organizational need. The regular expression in filter three will almost certainly need to be modified for your organization. Many will want to run the script daily or even hourly, which requires a commensurate change to the time window in filter two.

Of course, be sure to comment out the “update_role” function until you have completely tested the script in your environment. I generally insert a print statement in its place for feedback during testing. Like most scripts, there’s no “Undo” button.

For initial deployment, I saved the script to a “.py” file and scheduled it to run from a laptop on the corner of my desk; however, very shortly I’ll deploy it to a dedicated Amazon Lightsail Ubuntu server. Enjoy!

—–

Read more: