
Optional Module Geoprocessing with Python

After successfully finishing the first optional module “Developing applications with OSM”, I decided to focus on scripting and extend my Python skills.
I already gathered some Python skills during my bachelor thesis, where I hit the limits of the ModelBuilder and had to build a Python script. I also use Python quite often in the Field Calculator of ArcMap. Lately I have done some scripting with Python and the GDAL bindings, since Ecotrust, where I currently intern, prefers open source for its projects.
Some samples of my work that include Python with GDAL can be found here.

The module was set up like the previous modules, with 14 lectures and six assignments.

In assignment one we had to write a stand-alone script that calculates the perimeter of a circle and prints it out. It also had to include a try-except statement. For pi we used the math module.
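
A minimal sketch of such a script could look like this (the exact structure of our solution differed, and the input handling here is just one way to use try-except):

    import math

    # Ask for the radius, compute the perimeter, and guard the numeric
    # conversion with a try-except block, as required by the assignment.
    try:
        radius = float(input("Enter the circle radius: "))
        perimeter = 2 * math.pi * radius
        print("The perimeter is %.2f" % perimeter)
    except ValueError:
        print("Please enter a numeric radius.")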

The second assignment was a writing assignment. We had to answer questions about the ArcGIS workspace, environment settings, hierarchy levels, etc., and research the answers on the ArcGIS Resources website.

The third assignment was again a practical one. We had to use ArcPy to produce a script that creates a buffer around rivers and dissolves the created features. The script then had to be extended so that it automatically creates three buffers with preset distances.
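
In essence, the script boiled down to a loop like the following (a rough sketch; the workspace path, layer name, and distances are placeholders, not the actual assignment data):

    import arcpy

    arcpy.env.workspace = r"C:\data\assignment3.gdb"  # placeholder path
    arcpy.env.overwriteOutput = True

    # Create three river buffers with preset distances and dissolve the
    # overlapping buffer polygons in the same step.
    for distance in ("100 Meters", "250 Meters", "500 Meters"):
        out_name = "rivers_buffer_" + distance.split()[0]
        arcpy.Buffer_analysis("rivers", out_name, distance,
                              dissolve_option="ALL")
        print("Created " + out_name)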

In assignment four we had to create a tool for the ArcGIS toolbox. The tool itself adds fields to a shapefile. We also had to produce a help file and try to make the tool as user-friendly as possible.
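
The core of such a script tool is short; a hedged sketch (the parameter order and the field type are my assumptions, not the assignment specification):

    import arcpy

    # The tool's parameters, as configured in the toolbox dialog:
    # 0 = input shapefile, 1 = semicolon-separated list of field names.
    shapefile = arcpy.GetParameterAsText(0)
    field_names = arcpy.GetParameterAsText(1).split(";")

    for name in field_names:
        arcpy.AddField_management(shapefile, name, "TEXT")
        arcpy.AddMessage("Added field: " + name)  # shown in the tool dialog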

Assignment five was again a pure scripting assignment with ArcPy. We had to create a script that produces a new point feature class based on a user-specified offset.
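
Conceptually, the script reads every input point and writes a shifted copy; a simplified sketch (the feature class names and offset values are illustrative, and the spatial reference is omitted for brevity):

    import arcpy

    in_fc = "stations"  # illustrative input point feature class
    out_fc = arcpy.CreateFeatureclass_management(
        "in_memory", "stations_offset", "POINT")[0]
    x_off, y_off = 50.0, 50.0  # user-specified offset in map units

    # Copy each point, shifted by the given offset.
    with arcpy.da.SearchCursor(in_fc, ["SHAPE@XY"]) as in_rows, \
            arcpy.da.InsertCursor(out_fc, ["SHAPE@XY"]) as out_rows:
        for row in in_rows:
            x, y = row[0]
            out_rows.insertRow([(x + x_off, y + y_off)])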

The last assignment was to create a script for the field calculator. The code had to test the values of three fields and return a result.
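
In the field calculator this takes the form of a small pre-logic function; a sketch under my own assumptions about the test logic (return the first of the three fields that holds a value; the field names are hypothetical):

    # Pre-Logic Script Code:
    def first_value(a, b, c):
        for value in (a, b, c):
            if value is not None and value != "":
                return value
        return None

    # Expression for the target field:
    # first_value(!FIELD_A!, !FIELD_B!, !FIELD_C!)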

Overall, I would recommend the module if you are interested in learning geoprocessing with ArcPy. It gave a good introduction to ArcPy and to creating your own scripts. However, I think the title is again a little misleading, since the module exclusively used ArcPy; a better title would be “Geoprocessing with ArcPy”. I would also like to have seen some other Python GIS bindings, like GDAL or Shapely, in the module. Neither was mentioned, even though they have great advantages over ArcPy (e.g. speed and flexibility). ArcPy, however, is definitely more user-friendly, and therefore it makes sense to focus on it in an introductory class.

Module Spatial Analysis

The latest module, “Spatial Analysis”, deals with the core task of any GIS professional: actually analyzing spatial data and making maps. The module was very practical and also very ESRI-heavy.

As always, the module started out with a general introduction. Dr. John Snow’s map of the 1854 London cholera outbreak was used to introduce us to spatial analysis: he mapped the cholera cases in conjunction with the water sources (wells) in the area.

The module covered the typical spatial analysis methods very broadly, together with their theoretical backgrounds. The topics included:

  • Map algebra
  • Spatial selection and aggregation
  • Forms and patterns
  • Distance analysis
  • Spreading and diffusion
  • Network analysis
  • Interpolation
  • Surface description and analysis
  • Overlay
  • Classification
  • Graphical modeling
  • Geosimulation
  • Fuzzy modeling

Since all of these topics are covered in any GIS introduction book and most GIS professionals are very familiar with them anyway, I will not go into the lecture content in depth. The assignments, too, are typical spatial analysis use cases and are already well documented in the ArcGIS Resource Center.

I think this module is something every GIS professional should be confronted with in his/her career. We perform these kinds of analyses on a daily basis, and it really makes a difference in interpreting the results if we actually understand the theoretical background of the tools we use. So I was very glad that, in addition to the many ArcGIS assignments we had to solve, a big chunk of theory was explained in the lectures. I intend to go back to the lectures regularly and review the theory behind the practical tasks. The more we understand what we do, the better we can interpret the results. ArcGIS performs quite some geoprocessing magic that we often no longer understand, and we have to make sense of the produced data. By going back to the theory, we can interpret the results more efficiently and with better quality. This module is a great resource and starting point for that.

Module OpenGIS and Distributed GI Infrastructures

The module OpenGIS and Distributed GI Infrastructures was clearly dominated by the OGC standards and the software surrounding them. These were things I had already heard of a couple of times, especially during the web mapping summer school, so I was especially interested in getting an in-depth understanding of the standards and everything impacted by them. Like always, the module started out by covering the status quo. Following that, the concept of GI infrastructures was very well explained. It stayed very theoretical and dry with INSPIRE, the European Union’s approach to establishing an EU-wide GI infrastructure. Towards the end it got a lot more practical, with topics like WMS, GML, WPS, and WFS.

The tasks we had to solve were very applied, and I could use my previous knowledge and experience quite a bit. Our first task was to sketch out a scenario where various geodata are exchanged within a company, but also with external partners. We had to explain what kinds of problems could arise and how to solve them. The big topic was data interoperability. I used DHI Wasy’s GeoFES as an example: various data come from external sources (city governments, fire departments, etc.) and have to be integrated into the GeoFES GI architecture. The graphic below demonstrates the data flow.
[Screenshot: data flow between the external sources and the GeoFES GI architecture]

The second assignment was a little less practical. Our task was to compare the GI infrastructures of Austria, Switzerland, and Germany. They are all organized under the big INSPIRE umbrella and have to implement its principles in their national GI infrastructures.

The third assignment was a lot more practical again, and I learned a lot. We had to choose a random protected area and, as a first step, transform the data file (e.g. a shapefile) to GML, which is fairly easy with QGIS, as shown below.
[Screenshot: exporting the protected area layer as GML in QGIS]
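
The same conversion can also be scripted with the GDAL/OGR Python bindings I mentioned earlier; a minimal sketch (the file names are made up):

    from osgeo import ogr

    # Convert a shapefile to GML via the OGR "GML" driver.
    source = ogr.Open("protected_site.shp")
    driver = ogr.GetDriverByName("GML")
    driver.CopyDataSource(source, "protected_site.gml")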

Afterwards we had to edit the GML file in a text editor and modify it to meet the INSPIRE specifications for a protected site. In the last step we had to deploy it as a WFS with GeoServer.

In assignment four we had to answer several questions on how to query a web database (topics like WFS, the OpenGIS Filter Encoding Specification, and the OpenGIS Catalogue Services Specification).
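
For illustration, a typical WFS GetFeature request of the kind the questions revolved around can be assembled like this (the endpoint and type name are hypothetical):

    import urllib.parse
    import urllib.request

    # Build a WFS GetFeature request against a (hypothetical) GeoServer.
    params = urllib.parse.urlencode({
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": "ps:ProtectedSite",
        "maxFeatures": "10",
    })
    url = "http://localhost:8080/geoserver/wfs?" + params
    with urllib.request.urlopen(url) as response:
        print(response.read()[:500])  # first bytes of the GML response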

The last assignment closed the circle. We had to create a full concept for the interoperability problem described in assignment one. The concept needed to include a description of the current situation, a solution concept, a project plan, an economic feasibility analysis, and an elevator pitch in which we had to convince our management of the concept.

Module Geodatabase Management

I was highly interested in the latest module, Geodatabase Management. Databases are something we use all the time, even if we aren’t really aware of it; Facebook’s and Google’s data infrastructures are based on some sort of database. Spatial data can also be stored in a database, which has the big advantage of making many operations a lot faster. When I think of databases and GIS, the first thing that comes to my mind is the PostgreSQL extension PostGIS. For the purposes of this module, though, we used SQLite and SpatiaLite, because they don’t require any setup. The general database commands are fairly universal and largely independent of the specific program.

The module started out with the general fundamentals of databases: the historical development, the reasons for using databases, and the diversity of models used to represent the real world. The first assignment was fairly simple: we had to define database-related vocabulary, such as what an entity type is.

The class moved on with the architecture of database systems. As an assignment we had to create a logical model in Bachman notation for a sensor network (see graphic below) and later transform it into a relational model.
[Diagram: logical model of the sensor network in Bachman notation]

The two following assignments covered normalization extensively. After these very theoretical topics and assignments we got to the core part of the module, which was SQL. We had to perform plenty of SQL statements via SQLite on a given dataset. Towards the end we came to the specifics of geodatabases: the advantages of indexing, spatial SQL, etc.
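
To give a flavor of the exercises, here is a small sketch using Python’s built-in sqlite3 module (the sensor table and values are made up, not the assignment dataset):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE sensors (
        id INTEGER PRIMARY KEY,
        name TEXT,
        temperature REAL)""")
    conn.executemany(
        "INSERT INTO sensors (name, temperature) VALUES (?, ?)",
        [("S1", 18.4), ("S2", 21.0), ("S3", 19.7)])

    # A typical aggregate query like the ones we practiced.
    for row in conn.execute(
            "SELECT COUNT(*), AVG(temperature) FROM sensors"):
        print(row)  # -> (3, 19.7)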

After performing plenty of SQL statements, normalizing all kinds of things, and creating a lot of models, I really feel I know a lot more about database management. In particular, I now see the advantages of databases much more clearly, which wasn’t the case before starting the module.

Module Project Management

The latest module, Project Management, was already worrying me when I decided to sign up for the study program itself. During my bachelor’s degree I had some not-so-good experiences with similar modules; to me they always seemed like a lot of talking about nothing, and I never liked it. Nevertheless, I tried to stay positive and hoped to finally get something out of topics like this. I know project management is something I will be confronted with on a daily basis once I am out in the “real” job world, so I was hoping to finally get into the topic and see why it comes up in almost every field.

The module was set up in 15 lectures and covered topics around project management tools, organizational skills, legal aspects, etc.
In the end we got one big assignment to solve. The assignment was set up like a role play and included three roles, which were also our three tasks:

  • Role 1: As an active EU citizen, we were supposed to create a draft of a possible online climate GIS based on the recent IPCC report. It should enable users to identify causes of climate change and give them opportunities to act against those causes. The draft of the GIS was then to be submitted to a local politician.
  • Role 2: As an EU politician, we had the task of taking the citizen’s draft and transforming it into a Logical Framework Matrix.
  • Role 3: As the project lead of a project management office, we were supposed to create a project handbook for a feasibility analysis of that possible climate GIS.

The first two roles were relatively easy to solve. The freedom in the details invited great creativity, but also caused quite a bit of confusion about how far to go and what exactly was required. For the third role we were required to use default templates from next level consulting. It required a lot of copy and paste; in the end a hefty project handbook resulted, but the learning effect was pretty low.

I was hoping to learn the basic principles of project management, but all we did in the end was take a consulting company’s default template and edit it a little with some random ideas.

Optional Module Developing Applications with OSM

I finished my first optional module, “Developing Applications with OSM”, and I am quite satisfied with what I learned. The title of the module was a little bit misleading, though: we did not really develop any applications; rather, we used applications to process OSM data.

The structure of the module was roughly based on the OSM book by Jonathan Bennett. The end of the module covered a little bit of OpenLayers; here the content was based on the book OpenLayers 2.10 by Erik Hazzard.

The module started out with a couple of articles about Web 2.0, crowdsourcing, neogeography, etc. We had to write a short essay about these new phenomena.

We accessed all the software via a virtual Ubuntu machine; UNIGIS sent us a DVD with all the necessary programs. Everything was preinstalled and really easy to use.

For the first big module project we had to merge new data (hiking paths we received from TraffiCon) with existing data via JOSM. It was quite difficult for me because I had not yet completely understood the principles of OSM, but I figured it out in the end.

We also had to render downloaded OSM data with Maperitive and Osmarender.

Most interesting for me was the Osmosis tool, a command line application for processing OSM data. We filtered and extracted different data with it.

The last assignment of the module was an essay about future trends in Volunteered Geographic Information, with a focus on citizens as sensors.

Optional Module Developing Applications with OSM – KICKOFF

I have finished my first three mandatory modules, so I am able to select my first optional module. I chose the module “Developing Applications with OSM”, taught by Enrico Steiger (working for Trafficon). OSM stands for OpenStreetMap, a good alternative to Google Maps with the goal of creating a freely editable map of the world. Just this spring, the application Foursquare announced that it is switching from Google Maps to OSM.

I am quite thrilled about the module, since I would like to use OSM data in my master’s thesis. We will also use a Linux Ubuntu virtual machine in the module. I am looking forward to this, as I just used one in the summer school and had a good first experience with it.

Here is a nice introductory video to OSM, showing edits to the map over the course of one year:

Girona Summer School 2012: Open Web Map Services and Web Mapping

Right after the GI_Forum I went to Girona, Spain, where I participated in the Open Web Map Services and Web Mapping Summer School organized by SIGTE (the GIS & Remote Sensing Centre of the University of Girona). The summer school was held on their historical city campus. There were about 30 participants from many different countries (Mexico, England, Austria, Hungary, USA, etc.), and I made great connections with various people in the world of GIS.

DAY 1: Welcome and Introduction. Working with PostGIS and Introduction to Spatial Databases and SQL (Lluís Vicens + Toni Hernández – SIGTE)

On the morning of the first day we got an introduction to Free and Open Source Software, most of which I had heard in previous seminars. We continued with introductions to spatial databases, SQL, PostgreSQL, PostGIS, and pgAdmin as the administrative user interface. The afternoon was dominated by lots of SQL queries.
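
A typical query from those exercises looked roughly like this, here run from Python via psycopg2 (the connection settings and the table are invented for illustration):

    import psycopg2

    conn = psycopg2.connect(dbname="girona", user="postgres")
    cur = conn.cursor()
    # The PostGIS function ST_Area computes each polygon's area.
    cur.execute("""
        SELECT name, ST_Area(geom) AS area
        FROM parcels
        ORDER BY area DESC
        LIMIT 5;
    """)
    for name, area in cur.fetchall():
        print(name, area)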

After an exhausting first day, SIGTE organized a guided tour for us around Girona’s old town.

DAY 2: Introduction to Open Web Services (Jeremy Morley – University of Nottingham)

Jeremy Morley started out by explaining to us the necessity of standards. After that we went on with various XML exercises. In the afternoon we did exercises with Open Web Services: GetCapabilities requests, consuming open web services in a desktop GIS, and geodetic and map projection systems.
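
A GetCapabilities request is just an HTTP call; a tiny sketch (the endpoint is illustrative):

    import urllib.request

    # Ask a WMS what layers and operations it offers.
    url = ("http://localhost:8080/geoserver/wms"
           "?service=WMS&version=1.1.1&request=GetCapabilities")
    with urllib.request.urlopen(url) as response:
        capabilities_xml = response.read()
    print(capabilities_xml[:300])  # start of the capabilities document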

DAY 3: Working with GeoServer. Creation of Open Web Services (OWS) (Juan Marin – OpenGeo)

Juan Marin from OpenGeo gave us an introduction to GeoServer. Topics included data loading, basic styling, OGC standards, the web interface, and map viewing with OpenLayers and Google Earth.

DAY 4 + 5: Working with OpenLayers and GeoExt to design and create rich web mapping applications (Juan Marin – OpenGeo)

On Thursday and Friday, guided by Juan Marin (again from OpenGeo), we created complete web mapping applications.

We learned how to create browser-based map applications and display data with OpenLayers. This required lots of JavaScript syntax and use of the command line. Both were completely new to me, but not as difficult as I expected.

It was great that the instructors came directly from OpenGeo. They really could answer every question, and you could feel they were the people who had programmed the software.

All the exercises we did on days 4 and 5 can be found here (with very detailed instructions).


All the presentations from the lecturers can be downloaded here.

I can really recommend the summer school: great organization and infrastructure, great lecturers, and great people to connect with! If they offer one next year, I will definitely go.

GI_Forum 2012

Last week I attended the GI_Forum in Salzburg, Austria. The GI_Forum is an English-language symposium organized annually by the Centre for GeoInformatics (Z_GIS) at Salzburg University.

The GI_Forum covered the following topics:

  • Learning with Geoinformation
  • Location Based Services
  • Analysis and Risk Management
  • Spatial Analysis
  • Spatial Information and Society
  • GI Technology
  • Visualization


Can we make an App for that?

Besides outstanding keynotes, I found the education panel “Learning with Geoinformation” quite interesting. The presentation by T. Schauppenlehner (University of Natural Resources and Life Sciences, Austria) and his student especially stood out for me: “Can we make an App for that? Integration of school students within a research-education cooperation”.

Students from a secondary technical school (ages 17-18) worked on small research topics and created geospatial web applications (apps) as final products. By creating apps, the whole research process became more transparent and ended in a functional digital product with added value for the project, the students, and the schools. The students contributed their technical skills and learned how to answer specific questions by selecting suitable methods as well as collecting and analysing data.

I was fascinated by how deep you can go into GIScience with high school students, producing such great applications. The result of one of the projects can be seen here. The full paper can also be downloaded here.


My first publication

Furthermore, the GI_Forum gave me the opportunity to present my bachelor thesis, “ArcGIS Tool ‘Biomass Cost Analyst’ Enhances Biomass Quantification and Forest Management in Fairbanks, Alaska (USA)”. After a long double-blind review process, the paper was published in GI_Forum 2012: Geovisualization, Society and Learning (Jekel, T.; Car, A.; Strobl, J.; Griesebner, G., Eds.; Wichmann Verlag, Heidelberg, 2012). It is my first publication. The paper can be downloaded here. The following abstract summarises the paper:

Fairbanks Forestry needs to provide a recently opened pellet mill with 405 hectares of woody biomass annually, at a price of 43 USD or less per ton. Is Fairbanks Forestry able to provide the required biomass? The developed ArcGIS tool “Biomass Cost Analyst” gives answers. A price per ton of woody biomass was calculated for each timber stand. This was done by using the Fairbanks forest inventory database, by determining several cost parameters, and then processing this data with a tool created for use in ArcGIS. Development of the tool was carried out by creating a model with the ModelBuilder in ArcGIS Desktop 10. The model is composed of a main model and submodel. The submodel is necessary to facilitate the use of iteration. In addition, a python script tool was created for the submodel. At the 43 USD per ton limit, there are approximately 24,000 hectares, with more than 2.4 million tons of woody biomass, available. The majority of the acreage is comprised of birch and aspen stands, with the strata Birch Closed 26%, and Aspen Closed 25%. Currently, the pellet mill mainly uses white spruce, of which there is only a small amount available.

Module Data Sources and Data Acquisition

The third module got to the source of GIS, the place where every GIS analysis starts out: “Data Sources and Data Acquisition”.

After an introduction we got to geodesy, one of the core disciplines of GIS. We learned about primary data acquisition by simply measuring.

Following that, we went back to the digital world with photogrammetry and laser scanning. Here we had to solve our first assignment: we had to georeference analog maps with ArcGIS, document the transformation parameters, determine point coordinates, and discuss possible errors. Georeferencing wasn’t completely new to me; during my internship for DNR Alaska, I had to georeference seed maps. I really like this kind of work, because it so clearly connects the analog and digital worlds.
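
For context, the transformation parameters we had to document describe a first-order (affine) transformation; a small sketch with made-up parameter values:

    # First-order (affine) georeferencing: map pixel (col, row) to map
    # coordinates with six parameters fitted from the control points.
    a, b, c = 0.5, 0.0, 389000.0    # x scale, rotation term, translation
    d, e, f = 0.0, -0.5, 6476000.0  # y rotation term, scale, translation

    def to_map(col, row):
        return (a * col + b * row + c,
                d * col + e * row + f)

    print(to_map(100, 200))  # -> (389050.0, 6475900.0)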

On we went into the world of Global Navigation Satellite Systems, which include not only GPS (USA), but also GLONASS (Russia), Galileo (Europe), and COMPASS (China).

After navigation systems we came to the broad GIS topic of remote sensing, a topic I was not very familiar with yet, so I was curious to learn more about it. As an assignment we had to do research in the USGS EarthExplorer; the USGS is the official distributor of the LANDSAT 7 data. We had to evaluate and interpret LANDSAT scenes of Mexico City and of our home town. I was surprised that all the data is available for free and is fairly easily accessible. It is really interesting to check out the different images and make sense of vegetation cover, cloud coverage, etc. Below is a LANDSAT 7 image of Berlin with almost no cloud coverage.
[Screenshot: LANDSAT 7 image of Berlin]

Right after that we came to digitizing, which is considered secondary data acquisition, as opposed to primary data acquisition by measuring. As an assignment we had to design a file geodatabase and digitize a particular area’s transportation system and its land use coverage. I digitized the village I grew up in, since I am very familiar with the area and it is relatively clearly laid out. Below you can see the digitized area.
[Screenshot: the digitized transportation network and land use]

The second part of the data acquisition assignment was to vectorize raster data. We used the ArcScan extension to vectorize contour lines and had to discuss various problems that occur when vectorizing.

The end of the module mainly covered topics like managing data, metadata, and transforming data. We had to perform a relatively complex transformation with the FME software: we had to combine data from various sources and formats into one database and process (buffer, reproject, etc.) the data along the way. Below is a screenshot of the FME workflow.
[Screenshot: the FME workflow]

I enjoyed working through the module, since it covered such a broad range of topics. Some of them I had done before (e.g. ArcScan), but the module also covered many fields that were new to me (e.g. FME, LANDSAT).