Monday, March 17, 2014

Combining BAG and AHN2 Point cloud data.

Data sources

In previous posts, I demonstrated how the BAG data and AHN2 rasters can be accessed with FME.

What I would like to demonstrate now is how to fetch the AHN2 point cloud data and combine it with the BAG buildings.
This will in effect show how easy it is to add elevation values to the BAG buildings.
The additional elevation information makes it possible, for example, to classify the buildings' roof types (flat vs. slanting) and to transform the 2D BAG buildings into 3D objects.


The BAG data was accessed for a small area of interest (AOI); this area contains the 2D footprints of buildings.

AHN2 Point Cloud

Much like the AHN2 raster data, the online point cloud data can easily be accessed via FME. One of the differences in this workspace is that the FeatureReader transformer is used instead of the RasterReader.


Combining the data

For spatially relating features there are a few options: you can go for the 'old-fashioned' method of clipping the point cloud data for each building, or use the SpatialRelator, but my preferred way is to use the SpatialFilter.
Why? Mainly for performance reasons, and because no extra transformers are necessary, as is the case with the SpatialRelator.
So after relating the features, the elevation information is in effect added to the buildings (whether it represents the correct height is another matter, which is not addressed here).
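To make the spatial filtering step concrete outside of FME, here is a minimal sketch of what relating points to a footprint boils down to: a ray-casting point-in-polygon test that keeps only the points falling inside a building outline. The footprint coordinates and point values are invented for illustration; this is not the SpatialFilter's actual implementation.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray at height y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical building footprint and point cloud sample: (x, y, z)
footprint = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
points = [(5.0, 5.0, 12.3), (15.0, 5.0, 2.1), (2.0, 8.0, 12.1)]

inside_points = [p for p in points if point_in_polygon(p[0], p[1], footprint)]
print(len(inside_points))  # 2 of the 3 points fall inside the footprint
```

The z-values of the surviving points are what carry the elevation information over to the building.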
There are lies, damned lies and statistics - Mark Twain.

So how to go about adding more than just the elevation information?
Well, after spatially relating the point cloud to each building, a number of statistics can be computed with the help of the StatisticsCalculator.

This added information can be used for initial classification purposes; for example, buildings with a low range value can be classified as having flat roofs.
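As a rough sketch of what the statistics step yields, the snippet below computes per-building elevation statistics and applies the naive range-based roof rule described above. The building IDs, z-values and the 0.5 m flatness threshold are all made-up assumptions, not values from the actual workspace.

```python
def elevation_stats(z_values):
    """Basic per-building elevation statistics from related point cloud z-values."""
    zmin, zmax = min(z_values), max(z_values)
    return {
        "min": zmin,
        "max": zmax,
        "mean": sum(z_values) / len(z_values),
        "range": zmax - zmin,
    }

def classify_roof(stats, flat_threshold=0.5):
    # A small elevation range suggests a flat roof; a large one a slanted roof.
    return "flat" if stats["range"] <= flat_threshold else "slanted"

# Hypothetical z-values of the points related to each building footprint
points_per_building = {
    "BAG-0001": [12.1, 12.2, 12.0, 12.3],  # little variation -> flat
    "BAG-0002": [8.0, 10.5, 9.2, 11.8],    # large variation -> slanted
}

roofs = {bid: classify_roof(elevation_stats(z))
         for bid, z in points_per_building.items()}
print(roofs)
```

In FME the same grouping is what the StatisticsCalculator's Group By setting takes care of.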

The Workspace.

After creating the AOI (Creator) the BAG data is fetched off the Internet in the BAG custom transformer.
For more info on how to do that see this previous post.
The point clouds are, in much the same way, fetched from the web; unfortunately it is not possible to grab only the point cloud features within the AOI.

Point cloud AHN2 custom transformer.

Once all the data is read, combining it is done with the SpatialFilter, and the StatisticsCalculator finishes the job by adding the additional elevation statistics (don't forget the Group By setting).
Note that I have opted for the Summary port of the StatisticsCalculator, since I am no longer interested in the points themselves. (Tip for good practice: drop anything you don't need ASAP!)

To be able to share some results I have created a 3D PDF that contains a building footprint (select it to view the elevation statistics), point cloud data and additionally derived features (TIN, contours). (Tip: download it and open it with Adobe Reader; the web browser cannot display it correctly.)
Notice the spikes in the point cloud data and the derived features. Some of them can be attributed to the roof material, others to windows, and the rest to vegetation (trees). How do I know? Well, here is a hint (switch to satellite view and head north).

Tuesday, March 11, 2014

Fetching all of the AHN2 raster data with FME.


Recently the AHN2 dataset was released to the public via the Dutch SDI (PDOK).
This was a perfect opportunity to set FME to work on this newly available resource.
This time the rasters and point cloud data were made available via an Atom feed (XML).
This means that you can download the 5 meter GeoTIFF rasters and LAZ point cloud data on a national scale.
Well, I had to try that... just for starters, to see how long it would take me to build the workspace and download the raster data.


Building the workspace took about 5 minutes; the tricky part was making sure you are grabbing the correct part of the XML. After that it's a walk in the park, letting FME get the data off the Internet.
The workspace itself (4 transformers) starts with a Creator to jump-start the HTTPFetcher, which grabs the XML document (the Atom feed).
After finding the correct XML tag and extracting the information (the URL) with an XMLFragmenter, the string is passed on to retrieve the rasters (RasterReader).
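The XML-handling step can be sketched outside FME as well: parse an Atom feed and pull out the download links, which is essentially what the HTTPFetcher and XMLFragmenter combination does. The feed content below is a made-up stand-in; the real PDOK feed has its own entry structure and URLs.

```python
import xml.etree.ElementTree as ET

ATOM_NS = {"atom": "http://www.w3.org/2005/Atom"}

# Hypothetical miniature Atom feed; the real PDOK feed looks different.
feed_xml = """<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>AHN2 5m rasters (example)</title>
  <entry>
    <title>ahn2_5m_r31hz2.tif.zip</title>
    <link rel="enclosure" href="http://example.org/ahn2/r31hz2.tif.zip"/>
  </entry>
  <entry>
    <title>ahn2_5m_r31hn1.tif.zip</title>
    <link rel="enclosure" href="http://example.org/ahn2/r31hn1.tif.zip"/>
  </entry>
</feed>"""

def extract_links(xml_text):
    """Return the href of every entry's link element."""
    root = ET.fromstring(xml_text)
    return [link.attrib["href"]
            for link in root.findall("atom:entry/atom:link", ATOM_NS)]

urls = extract_links(feed_xml)
print(urls)
```

Each extracted URL would then be handed to a raster reader, one download per feed entry.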


The raster data itself took a while longer, but that is mainly due to the sheer amount of data that needs to be fetched off the Internet.

Another aspect that I wanted to test was the ability to save the data, so I made use of the (very much welcomed back!) capability to save the data directly from the Data Inspector.
FFS is the native FME format, which supports both vectors and rasters. I also applied its compression option to see how compact such a big dataset can be made.
After about 15 minutes the Data Inspector happily reported finishing the job of saving the data, and I was pleasantly surprised to find out that only about 6 gigabytes of space were necessary.

Results and development

After downloading the data, I wanted to make the workspace easy to use and flexible enough for any spatial selection (my guess is that not many people are interested in the whole dataset, but mainly in the data for a specific area of interest).

AHN2 raster data.
So what could be better than combining it with additional services containing the raster bounding boxes and administrative boundaries?
That way you can select your area of interest, be it a municipality border or a custom-selected area.

The Dutch SDI provides the possibility to download data in much the same way, but once you have the data in your FME Workbench, you can do much more than just download it.

This is actually where it starts to get interesting, the opportunities that the height data provides are numerous.

More about downloading the point cloud data and combining the different SDI services on the next post.

By the way, did I mention that the rasters are read from a zip file? Well no, because that is so 2013 ;)

Saturday, March 1, 2014

Fetching all BAG WFS features

Fetch boy fetch!

The idea for this post originates from a tweet by an FME user about a blog post that explains how to access the Dutch SDI (PDOK) OGC web services. For the original post (in Dutch) see brentjesgeoict.


In that post a number of handy tips are given concerning fetching all the features residing in the national registration for addresses and buildings service (BAG).
I must admit that until now I have used OGC services strictly as an input data source, without actually realizing that there might be a limitation. After getting curious about it, I started finding out more about the possibilities of smart usage of OGC services and how FME can make that usage easy.


I am not going to give you a detailed overview of OGC; for that, use the Internet. But I will mention that the OGC standards are unique in the geospatial sector for being widely accepted standards created by design (in contrast to de facto standards, e.g. the shapefile).

Making it easy

The recommended way to overcome the service restrictions (in this case the service returns a maximum of 15,000 features per request) is to make use of the count and startIndex parameters of the GetFeature request (WFS 2.0.0).
To make sure that all of the features are returned, an initial query (resultType=hits) returns the total number of features served.
With that, a simple calculation ensures that the correct number of requests is always sent.
The use of parameters in the workbench makes it easy to adapt the GetFeature request syntax to any available feature type.
Allowing the area of interest (municipality, province or national) to be selected at runtime, together with the calculated requests, makes it all carefree and really easy to use.
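The paging arithmetic above can be sketched as follows, assuming WFS 2.0.0 count/startIndex semantics. The endpoint URL, the feature type name and the total of 43,210 hits are placeholders, not the actual service values.

```python
import math
from urllib.parse import urlencode

BASE = "https://geodata.example.org/bag/wfs"  # placeholder endpoint
PAGE_SIZE = 15000  # the service's maximum number of features per request

def getfeature_requests(total_hits, type_name="bag:pand", page_size=PAGE_SIZE):
    """Build one GetFeature URL per page needed to fetch total_hits features."""
    n_requests = math.ceil(total_hits / page_size)
    urls = []
    for i in range(n_requests):
        params = {
            "service": "WFS",
            "version": "2.0.0",
            "request": "GetFeature",
            "typeNames": type_name,
            "count": page_size,
            "startIndex": i * page_size,  # skip the pages already fetched
        }
        urls.append(BASE + "?" + urlencode(params))
    return urls

# Suppose the initial resultType=hits query (not shown) reported 43,210 features:
urls = getfeature_requests(43210)
print(len(urls))  # 3 pages of 15,000 cover 43,210 features
```

The same ceiling division is all the "simple calculation" in the workspace has to do; each request then starts where the previous one left off.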
FME Workbench


To make the results accessible I have created a Google map.
With the help of the filter option, simple questions such as "which buildings were built last year?"* can be directly answered and visualized.

* Results are based on a temporary service, and may not be updated or complete.
Interested in the workspace? Give me a call.