Albin Larsson: Blog

Culture, Climate, and Code

Enabling CORS in SOCH with a Proxy on Google App Engine

24th November 2015

Update: SOCH now supports CORS.

One of my favorite APIs is SOCH, also known as K-Samsök. I love it because of its content: at the time of writing it contains 6,117,658 heritage items.

The API has several disadvantages too. The biggest one is probably that it's "Linked Data First" (I might just have invented that expression); the second is licensing (it's messy and has some conflicts); the third is that it does not support CORS (Cross-Origin Resource Sharing), so you can't use it directly from a client.

Therefore, I created a SOCH proxy for Google App Engine that enables CORS and supports all features of the API, such as XSLT stylesheets and JSON-LD (through a custom request header).
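The core idea of such a proxy is simple: relay the request to SOCH and attach the CORS headers the browser needs. Here is a minimal sketch in Python (the function name and upstream URL are illustrative assumptions, not the actual ksamsok-proxy-gae code):

```python
# Hypothetical sketch: take the upstream response headers and add the
# CORS headers that let a browser client read the response.

def add_cors_headers(headers):
    """Return a copy of the response headers with CORS enabled."""
    cors = dict(headers)
    # Allow any origin to read the response.
    cors["Access-Control-Allow-Origin"] = "*"
    # Allow the custom headers K-Samsök uses (e.g. for JSON-LD).
    cors["Access-Control-Allow-Headers"] = "Accept, Content-Type"
    return cors

upstream = {"Content-Type": "application/xml"}
print(add_cors_headers(upstream))
```

The real proxy also forwards the query string and body untouched, so every existing API feature keeps working.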

Thanks to Google App Engine it's totally free to host and run. The source is available in my ksamsok-proxy-gae repository on GitHub.

Notes

XSLT stylesheets

The XSLT stylesheets are served directly from within this app. The stylesheets used are the ones from the latest open-sourced version of K-Samsök. That version was released in 2013, so the ones this application serves might differ from the ones on K-Samsök's servers. If you run into trouble, please open an issue and I will look into it.

JSON-LD

K-Samsök supports JSON-LD through a custom HTTP header (Accept format), and this works through the proxy as well. In some rare cases it does not work, because not all web browsers and web servers support HTTP headers with a space in the name. This issue exists with regular use of the K-Samsök API too.
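To illustrate, here is how a client might set that header (the proxy URL and header value are hypothetical examples; check the repository for the real endpoint):

```python
import urllib.request

# Hypothetical proxy URL; the real App Engine URL differs.
PROXY = "https://example-ksamsok-proxy.appspot.com/api"

req = urllib.request.Request(PROXY + "?method=search&query=painting")
# Note the space in the header name ("Accept format") -- this is
# exactly what some browsers and servers fail to handle.
req.add_header("Accept format", "application/json-ld")

print(req.get_header("Accept format"))
```

Clients that cannot send such a header at all will have to fall back to the default XML responses.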

Winning the Nordic Open Data Challenge

23rd November 2015

I attended Slush a week ago and had a great time. I really liked some of the talks (the best one was by Christopher Fabian of UNICEF) and met a lot of people who are doing interesting things and who thought what I do is great.

Photo from Christopher Fabian's (UNICEF) talk

As frequent readers of this blog know, the reason I got to Slush was the Nordic Open Data Challenge, which the Biocaching project participated in (previous posts about Biocaching: 1, 2, 3).

Biocaching ended up being one of the three winning projects, together with Humans4Oceans and SpaceInvaders. I'm a fan of all the projects, while at the same time I believe each has major issues that must be overcome.

Photo of Alice, Peter and me.

Biocaching Continues

25th October 2015

Photo from a press conference in Port Louis

Twice before I have written about Biocaching: first when we won Hack4no, then again when we relaunched the idea under a new name with a new website.

Since then we got into the Citizen Science Challenge hosted by the UNEP/Eye on Earth Alliance. Bjørn and Alice went to Abu Dhabi and got a lot of great feedback.

We were featured on UNEP's website.

Last week, Biocaching was presented at a press conference in Port Louis, Mauritius, by the country's Minister of Environment (press letter).

We were featured on the webinar “Citizen Science & Citizens’ Observatories”, hosted by Bente Lilja Bye(link).

Jacqueline McGlade, Chief Scientist at UNEP, also had some thoughts about our project.

Those are some things that have happened recently; up next is Slush in Helsinki. We would love to meet and talk open data and crowdsourcing!

OpenStreetMap CLC06 Import Cleanup Part Two

6th September 2015

In January I proposed manually cleaning up a lot of imported CLC06 data to preserve the OSM ecosystem. Mappers had a hard time working with the large multipolygons, and tools such as Tilemill, Mapbox Studio, and iD all had rendering issues. Because of those issues, the multipolygons became even more broken by mappers trying to handle them.

The two main multipolygons each covered an area of about 80 × 120 km.

Overpass image over the multipolygons.

Removed

Today I removed the last major imported multipolygon (the eastern one). I removed the first one all the way back in January, then spent about a month mapping all the forest back from scratch. I moved on to the next multipolygon, ran into ±5.0 × 10⁻³²⁴ to ±1.7 × 10³⁰⁸ errors, and moved on to other things.

I learned to write some more powerful Overpass queries, and earlier this Sunday morning I pulled the trigger on the second multipolygon: done in less than five minutes. I also brutally smashed some "smaller" low-quality CLC06 multipolygons into nothing.
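Finding every piece of the import is the kind of job Overpass is good at. A sketch of such a query, built from Python (the tag filter is an assumption; CLC imports are typically marked with a `source=CLC06` or similar tag, and the bounding box is just an example area):

```python
# Build an Overpass QL query that collects all ways and relations
# carrying the (assumed) CLC06 import tag inside a bounding box.
BBOX = "58.0,11.0,60.0,14.0"  # south,west,north,east (example area)

query = f"""
[out:json][timeout:120];
(
  relation["source"="CLC06"]({BBOX});
  way["source"="CLC06"]({BBOX});
);
out ids;
"""
print(query)
```

The resulting object IDs can then be fed to an editor like JOSM for review and deletion, which is far faster than hunting the members down by hand.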

Imported CLC06 multipolygons, currently:

overpass screenshot

Ahead

OpenStreetMap lost tons of data as a result of my work; still, I believe that the loss of bad data allows OSM to gain high-quality data faster. As I mentioned earlier, I mapped most of the forest from the first multipolygon back in smaller pieces and with higher quality. Mapping it all back will take time, but the forest data will end up being usable.

All the CLC06 relations and ways:

overpass screenshot

As shown in the image above, everything easily editable, such as small, unbroken multipolygons and areas, is still in there; progressive updates of those objects will eventually make the underlying low-quality import obsolete.
