Info on GIS development

We are in the process of changing the GIS module in terms of how the geographical information is persisted and presented.

In the snapshot version we now store the coordinates in JSON format directly in the database on the OrganisationUnit.coordinates property. This gives us a lot more flexibility in the way maps are presented.

Previously maps had to be registered explicitly either in the form of GeoJson files or Shapefiles. Then the user had to select a map together with indicator and period. Now the user can select an orgunit from a tree and the children of that orgunit at the level below will be displayed in the map.

In large countries like India it is impossible to display a single map at the lower levels (e.g. for thousands of districts) as the map will be too heavy and slow to load. Registering and managing maps for e.g. every province would also be too cumbersome. With the current solution there is no more work of registering and selecting maps - only the one-time job of importing geographical data/coordinates into the database.

Importing is a four-step process:

1. Convert your shapefiles (or whatever format you have) into GML.

The recommended tool is FWTools (http://fwtools.maptools.org/). The command for converting shapefiles into GML is:

ogr2ogr -F GML output.gml input.shp

(make sure you run the command from inside the folder containing the shapefiles)

Check the available formats by running ogr2ogr without arguments.

2. Make sure the XML element inside the GML file which contains the orgunit name is called exactly ogr:Name (use search and replace if not), e.g.

<ogr:Name>Badjia</ogr:Name>
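
If the names ended up in a different element, a plain search and replace is enough. For many files, something like this minimal Java sketch will do it, assuming the attribute came out as ogr:DISTRICT (a hypothetical name - substitute whatever ogr2ogr produced for your data):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class RenameGmlElement
{
    public static void main( String[] args ) throws IOException
    {
        Path gml = Path.of( "output.gml" );

        // Rename both the opening and closing tags of the (hypothetical) source element
        String xml = Files.readString( gml )
            .replace( "<ogr:DISTRICT>", "<ogr:Name>" )
            .replace( "</ogr:DISTRICT>", "</ogr:Name>" );

        Files.writeString( gml, xml );
    }
}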

3. Import the GML file into DHIS through the regular import interface (no need to zip it)

4. In the GIS module, make sure the Administrator - Map Source setting is set to DHIS database.

In the Polygon Layer screen, you can then select the orgunit from the tree which appears by clicking on the Parent orgunit field.

Caveat: Shapefiles tend to have duplicate orgunit names, at least at the lower levels, which will cause the import to crash as DHIS requires unique names. For now this has to be taken care of manually in the GML file/shapefile; we will see if we can handle this better in the future.
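
Until then it is easy to check a file for duplicates before importing. A minimal sketch, assuming the names are already in ogr:Name elements as described in step 2:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashSet;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FindDuplicateNames
{
    public static void main( String[] args ) throws IOException
    {
        String xml = Files.readString( Path.of( "output.gml" ) );
        Matcher matcher = Pattern.compile( "<ogr:Name>(.*?)</ogr:Name>" ).matcher( xml );

        Set<String> seen = new HashSet<>();

        while ( matcher.find() )
        {
            String name = matcher.group( 1 );

            if ( !seen.add( name ) )
            {
                System.out.println( "Duplicate orgunit name: " + name );
            }
        }
    }
}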

Feedback on this is appreciated as we hope to release soon. Using the module with GeoJson as map source works as before.

Thanks to Jan and Bob for great work on GIS / import so far…

Lars

Hi Lars, I tried to import the GML file using the import option. I don't see any error in the log, but the screen says "No import process running". I am attaching the GML file of Orissa which Jan has converted and given to me.

John

orissastate.gml (562 KB)


OK - are you running the latest source code from trunk, at least the dhis-service-importexport and dhis-web-importexport modules?

The file you sent imports beautifully here.

Lars


Import works fine here too. A quick caveat to Lars's instruction:
just upgrading those two modules probably won't work in this case.
There have also been bug fixes in dhis-service-xml, and some changes
in how coordinates are represented which may have implications
elsewhere. To be safe, I think you should just upgrade everything.

Bob


Very happy to see progress on this, but I cannot quite get it to work:

I had an existing hierarchy, and imported the coordinates for Admin1.
However, when I try to see the map, I get "Form is not complete" all
the time, with no indication of what is lacking (nothing, as far as I
can see).

Also, some more discussion is needed with regard to how to handle the
import of orgunits where there is no match (for example an empty
hierarchy).

Secondly, I think it would be really valuable to also calculate and
store the midpoint for polygons. This would enable us to have two
alternative renderings of a level/layer: either as a polygon, or as a
(proportional/colored) symbol placed at the midpoint.
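
Calculating such a midpoint would not take much code. A minimal sketch of the standard area-weighted centroid formula, assuming the polygon arrives as a closed ring of lon/lat pairs (for multi-polygons one could e.g. use the largest ring):

public class PolygonCentroid
{
    /**
     * Area-weighted centroid of a simple closed ring, where
     * ring[i] = { lon, lat } and the first and last points coincide.
     */
    public static double[] centroid( double[][] ring )
    {
        double area = 0, cx = 0, cy = 0;

        for ( int i = 0; i < ring.length - 1; i++ )
        {
            double cross = ring[i][0] * ring[i + 1][1] - ring[i + 1][0] * ring[i][1];
            area += cross;
            cx += ( ring[i][0] + ring[i + 1][0] ) * cross;
            cy += ( ring[i][1] + ring[i + 1][1] ) * cross;
        }

        area /= 2.0;
        return new double[] { cx / ( 6 * area ), cy / ( 6 * area ) };
    }
}

(For concave polygons this point can fall outside the shape; a point-on-surface algorithm, like the one in JTS/GeoTools, is more robust for symbol placement.)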

Knut


One problem is that the conversion to GML seems to generate very large
representations, because the GML coordinates are output with 15
decimals, whereas you would normally be happy with 5-6. Here is a
comparison of what I get in GeoJSON vs GML (converting from the same
shapefile):
GeoJSON: 38.415412, 1.750212
GML: 38.415411724082148,1.750212388592194
                                              
It also seems that the import process puts a bit too many brackets on
points. Example point representation (which also has a ridiculous tail of decimals):
[[[[37.270000000000003,-0.69]]]]

Example full polygon representation (gets very big when adding
hundreds of polygons into a layer):
[[[[35.241617396557501,-1.042755167363498],[35.082178302163747,-0.910721897610392],[35.016946729972211,-0.895020643910023],[35.011665399182093,-0.885742630359804],[35.024369140812389,-0.87746378749961],[35.019944242042286,-0.853483690939046],[35.045208986632879,-0.856766680349123],[35.060339285653235,-0.839352562608713],[35.077610664723643,-0.860192408429204],[35.085033075563814,-0.846061280098871],[35.139987463515105,-0.881032254249694],[35.136418996765023,-0.843206506698804],[35.168249720175773,-0.842064597338777],[35.172674618945877,-0.792391540177609],[35.202078784966567,-0.793390710867632],[35.231625689657264,-0.819083671468237],[35.262599981047991,-0.804667065797898],[35.297285477858807,-0.830645503738509],[35.334540270729683,-0.814087818018119],[35.343961022949905,-0.790393198797562],[35.27601741602831,-0.724162455916004],[35.242759305917524,-0.747857075136561],[35.241331919217494,-0.694758289895313],[35.262029026367976,-0.680056206884967],[35.291861408398681,-0.655076939634379],[35.387496317300929,-0.65935909973448],[35.397773501541174,-0.647940006134211],[35.420611688741708,-0.729301048036125],[35.481989316843155,-0.77797493450727],[35.539513000854505,-0.798101086977743],[35.434029123722027,-0.895020643910023],[35.408478901791426,-0.957111965361483],[35.360233231330291,-0.98123480059205],[35.346673057679972,-0.973812389751876],[35.241617396557501,-1.042755167363498]]]]


Having 15 decimals is of course ridiculous in our case. The surface distance per 1 degree change in latitude is approximately 111 km, so 15 decimals means an accuracy of 0.000000111 mm. I think five decimals should be appropriate (about 1 meter), or even four (11 meters) or three (111 meters)? Could this be done during the db insertion?
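
For reference, a small sketch that tabulates decimal places against approximate ground resolution, using the ~111 km per degree figure from above:

public class CoordinatePrecision
{
    public static void main( String[] args )
    {
        final double METERS_PER_DEGREE = 111_000; // approximate surface distance per degree of latitude

        for ( int decimals = 2; decimals <= 6; decimals++ )
        {
            double meters = METERS_PER_DEGREE / Math.pow( 10, decimals );
            System.out.printf( "%d decimals ~ %.2f m%n", decimals, meters );
        }
    }
}

This prints roughly 1110 m for 2 decimals, 111 m for 3, 11 m for 4, 1.1 m for 5 and 0.11 m for 6.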



I think this is just a question of understanding how many decimal
places are required/optimal/acceptable or what have you. If you think
4 or 5 or 6 is enough then I guess this can be reduced in the XSLT
transform.

Just saw Jan's mail. 3 is fine as well. It could be done at the
gml2dxf stage or at db insertion. Swings 'n roundabouts. The earlier
in the process the better from an efficiency perspective, but Java
rounding is maybe more efficient than an XPath function. Perhaps the
best of both worlds would be a simple Java rounding function made
available to the XSLT. Ideally it would be a parameter to ogr2ogr,
but we probably don't want our own custom binary.

Bob


There is probably some difference between polygons and points. For the
location of health facilities (points), I think it makes sense to
retain 4-5 decimals. Polygons (which obviously have a lot more
coordinates) can probably do fine with 3-4. Let us settle for 4
overall for now, and then people like Jan and me should do some testing.

k


It obviously also depends a bit on the scale - if we are talking about
world maps and just national borders, in my experience 2 decimals is
more than enough. But for a subdistrict map I think you would at least
want 3.

k


In fact the XPath expression will be a bit tricky. XPath is not
really well suited to this kind of data, where the coordinates are not
"proper" XML elements but are encoded in a string. In fact, right the
way through the process from GML to database we don't ever see the
coordinates as numbers. (Pet gripe about GML - that's really bad from
an XML processing perspective.) So they have to be tokenized first,
rounded with a pretty primitive XPath 1.0 function, and then
re-concatenated. I would do that armed with a stiff whiskey but not
otherwise.

I think the better place to round these values will in fact (as Jan
suggested) be in the orgunit converter. That is the one place where we
tokenize the coordinates anyway to put '['s around them. I can't
promise to look at this right now, but put it in a blueprint lest we
lose it (I guess it's not really a bug).
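
To illustrate, a minimal sketch of what that could look like in the converter, assuming it receives the raw GML coordinate string ("lon,lat lon,lat ...") and should emit a bracketed JSON form with a fixed number of decimals (class and method names here are hypothetical, not the actual DHIS code):

import java.math.BigDecimal;
import java.math.RoundingMode;

public class CoordinateRounder
{
    private static final int DECIMALS = 4;

    /**
     * Converts a GML coordinate string "lon,lat lon,lat ..." into the
     * JSON array form "[[lon,lat],[lon,lat],...]", rounding each value.
     */
    public static String toJson( String gmlCoordinates )
    {
        StringBuilder json = new StringBuilder( "[" );

        for ( String pair : gmlCoordinates.trim().split( "\\s+" ) )
        {
            String[] lonLat = pair.split( "," );
            json.append( "[" ).append( round( lonLat[0] ) ).append( "," )
                .append( round( lonLat[1] ) ).append( "]," );
        }

        json.setCharAt( json.length() - 1, ']' ); // replace the trailing comma
        return json.toString();
    }

    private static String round( String value )
    {
        return new BigDecimal( value ).setScale( DECIMALS, RoundingMode.HALF_UP ).toPlainString();
    }
}

For example, toJson( "37.270000000000003,-0.69" ) gives [[37.2700,-0.6900]] - which would also take care of the extra bracket levels Knut mentioned.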

Cheers
Bob


Not sure really where this is headed, but I am not 100% sure I am
comfortable with the direction of the conversation. Let me intervene.

We need to think about a few things here.

We are currently only supporting EPSG 4326. This is likely only
because we have really not needed any other projection (yet) but I
suspect this is going to bite us on the ass sooner or later. If we
think about other projections (UTM for instance) there are not always
decimals. If we think about the use of DHIS as a data collection tool
for humanitarian situations, or even in countries with an established
master facility list that does not use 4326 as a native projection,
then we should think about how we can extend the use of DHIS for these
use cases. Force them to transform the data prior to importation or
to use another data collection tool?

If there is not a strong enough use case (which right now it seems
there is not) perhaps it is not really an issue. It just sort of makes
me a bit uncomfortable, when we have to potentially do some gymnastics
to get data in a format that is fast enough for mapping purposes.
Generalization and reprojection of GIS data is pretty much part and
parcel when it comes to mapping of any GIS data. If DHIS requires data
in a certain format, it should be able to transform more precise data
in arbitrary projections into its own requirements. If you look at
FAO's KIDS application, they used a custom binary format. Not because
it was standards compliant, just because it was really damn fast to
read and render. I am not saying we need to go that route, but just to
highlight that if we are focused on using DHIS as a presentation
mechanism instead of a repository, then we should focus on that.

Ideally, DHIS should be able to consume any GML stream and convert it
into a format that is required. If a user attempts to import a stream
that is not EPSG 4326, it should be able to be converted and "trimmed"
into a format that is appropriate for the application. I would also
say that DHIS should eventually be able to "push" data to a
repository, such as Geoserver, which is much more suited to handling
transformation and presentation of many different types of data.
Geoserver, however, is built upon GeoTools, which should be more than
capable of handling reprojection, generalization and transformation of
polygons to centroids, regardless of which back-end database system is
being used.

Regards,
Jason



I understand your concern for the wider use of DHIS, with us currently
pretty much ignoring the excellent real GIS tools and libraries out
there. But for now, I think external transformations with ogr2ogr will
work OK, since reprojection is basically just one more parameter in
the one-line conversion to GML. We have already come across UTM36S
being used as a standard in Malawi. As you say below, we could easily
add some of this with GeoTools - but we have to prioritize. Plenty of
student projects around this, though, especially if you have some time
to guide them.
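
For reference, doing it inside DHIS with GeoTools would also be short. A minimal sketch, assuming JTS geometries and UTM zone 36S (the Malawi case) as the source projection:

import com.vividsolutions.jts.geom.Geometry;
import org.geotools.geometry.jts.JTS;
import org.geotools.referencing.CRS;
import org.opengis.referencing.crs.CoordinateReferenceSystem;
import org.opengis.referencing.operation.MathTransform;

public class Reproject
{
    /**
     * Transforms a geometry from UTM zone 36S (EPSG:32736) to WGS 84 (EPSG:4326).
     */
    public static Geometry toWgs84( Geometry utmGeometry ) throws Exception
    {
        CoordinateReferenceSystem source = CRS.decode( "EPSG:32736" );
        CoordinateReferenceSystem target = CRS.decode( "EPSG:4326", true ); // true = force longitude/latitude axis order

        MathTransform transform = CRS.findMathTransform( source, target, true ); // lenient matching
        return JTS.transform( utmGeometry, transform );
    }
}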

> If there is not a strong enough use case (which right now it seems
> there is not) perhaps it is not really an issue. It just sort of makes
> me a bit uncomfortable, when we have to potentially do some gymnastics
> to get data in a format that is fast enough for mapping purposes.
> Generalization and reprojection of GIS data is pretty much part and
> parcel when it comes to mapping of any GIS data. If DHIS requires data
> in a certain format, it should be able to transform more precise data
> in arbitrary projections into its own requirements.

> If you look at
> FAO's KIDS application, they used a custom binary format. Not because
> it was standards compliant, just because it was really damn fast to
> read and render. I am not saying we need to go that route, but just to
> highlight that if we are focused on using DHIS as a presentation
> mechanism instead of a repository, then we should focus on that.

It may be that we want to use DHIS as both a repository with full
precision (though not ridiculously artificial precision like 15-decimal
lat/lon) and have a faster way of rendering. But for a repo, I think
something like PostGIS is in order. Or we could just store things as
GML...

> Ideally, DHIS should be able to consume any GML stream and convert it
> into a format that is required. If a user attempts to import a stream
> that is not EPSG 4326, it should be able to be converted and "trimmed"
> into a format that is appropriate for the application. I would also
> say that DHIS should eventually be able to "push" data to a
> repository, such as Geoserver, which is much more suited to handling
> transformation and presentation of many different types of data.

We should be very conscious of not pushing the new, very simple
solution too far; for more complex functionality we should rather
employ Geoserver and PostGIS - and I still think that is the best
solution for a national repository. Our new way of storing orgunit
boundaries is a very small subset of such a full-blown GIS solution,
but has the advantage of being simple, lightweight and portable.

> Geoserver, however, is built upon GeoTools, which should be more than
> capable of handling reprojection, generalization and transformation of
> polygons to centroids, regardless of which back-end database system is
> being used.

Agree that some incorporation of GeoTools might be in order. In fact,
a GIS application using GeoTools was developed back in 2005 by the
master students Trond Andresen and Lars Gunnar Vik:

http://www.duo.uio.no/sok/work.html?WORKID=27723
http://folk.uio.no/trondand/hispgis/

However, for more complex functionality such as handling raster
layers, an optional integration with Geoserver is probably the way to
go.

Knut

···

On Sat, Jul 24, 2010 at 5:15 PM, Jason Pickering <jason.p.pickering@gmail.com> wrote:

Regards,
Jason

On Fri, Jul 23, 2010 at 10:31 PM, Bob Jolliffe <bobjolliffe@gmail.com> wrote:

In fact the xpath expression will be a bit tricky. xpath is not
really well suited to this kind of data where the coordinates are not
"proper" xml elements but are encoded in a string. In fact right the
way through the process from gml to database we don't ever see the
coordinates as numbers. (Pet gripe about gml - that's really bad from
an xml processing perspective). So they have to be tokenized first,
then rounded with a pretty primitive xpath 1.0 function then
re-concatenated. I would do that armed with a stiff whiskey but not
otherwise.

I think the better place to round these values will in fact (as Jan
suggested) be in the orgunit converter. That is the one place we do
tokenize the coordinates anyway to put '['s around them. Can't
promise to look at this right now but put it in a blueprint lest we
lose it (I guess its not really a bug).

Cheers
Bob

On 23 July 2010 21:13, Knut Staring <knutst@gmail.com> wrote:

It obviously also depends a bit on the scale - if we are talking about
world maps and just national borders, in my experience 2 decimals is
more than enough. But for a subdistrict map I think you would at least
want 3.

k

On Fri, Jul 23, 2010 at 10:11 PM, Knut Staring <knutst@gmail.com> wrote:

There is probably some difference between polygons and points. For the
location of health facilities (points), I think it makes sense to
retain 4-5 decimals. Polygons (which obviously have a lot more
coordinates) can probably do fine with 3-4. Let us settle for 4
overall for now, and then people like Jan and me should do some
testing.

k

On Fri, Jul 23, 2010 at 9:54 PM, Bob Jolliffe <bobjolliffe@gmail.com> wrote:

2010/7/23 Knut Staring <knutst@gmail.com>:

2010/7/14 Lars Helge Øverland <larshelge@gmail.com>:

Importing is a 4 step process:

1. Convert your shapefiles (or whatever format you have) into GML.
The recommended tool is FWTools, http://fwtools.maptools.org/ . The command
for converting shapefiles into GML is
ogr2ogr -F GML output.gml input.shp

One problem is that the conversion to GML seems to generate very large
representations, because the GML coordinates are output with 15
decimals, whereas you would normally be happy with 5-6. Here is a
comparison of what I get in GeoJSON vs GML (converting from the same
shapefile):
GeoJSON: 38.415412, 1.750212
GML: 38.415411724082148,1.750212388592194

I think this is just a question of understanding how many decimal
places are required/optimal/acceptable or what have you. If you think
4 or 5 or 6 is enough, then I guess this can be reduced in the xslt
transform.

Just saw Jan's mail. 3 is fine as well. It could be done at the
gml2dxf stage or at db insertion. Swings 'n roundabouts. The earlier
in the process the better from an efficiency perspective, but java
rounding is maybe more efficient than an xpath function. Perhaps the
best of both worlds might be a simple java rounder function available
to the xslt. Ideally it would be a parameter to ogr2ogr, but we
probably don't want to maintain our own custom binary.

Bob

It also seems that the import process puts a bit too many brackets on
points. Example point representation (which also has a ridiculous tail
of decimals):
[[[[37.270000000000003,-0.69]]]]

Example full polygon representation (gets very big when adding
hundreds of polygons into a layer):
[[[[35.241617396557501,-1.042755167363498],[35.082178302163747,-0.910721897610392],[35.016946729972211,-0.895020643910023],[35.011665399182093,-0.885742630359804],[35.024369140812389,-0.87746378749961],[35.019944242042286,-0.853483690939046],[35.045208986632879,-0.856766680349123],[35.060339285653235,-0.839352562608713],[35.077610664723643,-0.860192408429204],[35.085033075563814,-0.846061280098871],[35.139987463515105,-0.881032254249694],[35.136418996765023,-0.843206506698804],[35.168249720175773,-0.842064597338777],[35.172674618945877,-0.792391540177609],[35.202078784966567,-0.793390710867632],[35.231625689657264,-0.819083671468237],[35.262599981047991,-0.804667065797898],[35.297285477858807,-0.830645503738509],[35.334540270729683,-0.814087818018119],[35.343961022949905,-0.790393198797562],[35.27601741602831,-0.724162455916004],[35.242759305917524,-0.747857075136561],[35.241331919217494,-0.694758289895313],[35.262029026367976,-0.680056206884967],[35.291861408398681,-0.655076939634379],[35.387496317300929,-0.65935909973448],[35.397773501541174,-0.647940006134211],[35.420611688741708,-0.729301048036125],[35.481989316843155,-0.77797493450727],[35.539513000854505,-0.798101086977743],[35.434029123722027,-0.895020643910023],[35.408478901791426,-0.957111965361483],[35.360233231330291,-0.98123480059205],[35.346673057679972,-0.973812389751876],[35.241617396557501,-1.042755167363498]]]]
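
For reference, following GeoJSON conventions the bracket depth encodes
the geometry type, so a plain point should presumably be stored as a
single pair rather than wrapped to MultiPolygon depth:

Point:        [37.27, -0.69]
Polygon:      [ [ [x1,y1], [x2,y2], ... ] ]    (one outer ring, plus optional holes)
MultiPolygon: [ [ [ [x1,y1], ... ] ], ... ]    (a list of polygons)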

Hi Knut,

It may be that we want to use DHIS as both a repository with full
precision (though not ridiculously artificial ones like 15 decimal
lat/lon) and have a faster way of rendering. But for a repo, I think
something like PostGIS is in order. Or we could just store things as
GML...

Well, this is really the issue. If DHIS is going to be a repository,
any self-respecting GIS geek would not use it if the application
clipped precision. Although a few meters is not significant in terms
of rendering a map, it may cause havoc on certain datasets,
particularly if there are topological relationships between different
layers. If a facility is related topologically to a road network, and
the point is shifted a few meters, this may disturb the topology
between these layers, rendering DHIS useless as a repository. ogr2ogr
is perfectly OK as long as we are not dealing with these types of
layers, but as soon as we start to think about relationships to other
layers, we need to be very careful about how the data is preprocessed.

We should be very careful not to push the new, very simple solution
too far; for more complex functionality we should rather employ
Geoserver and PostGIS - and I still think this is the best solution
for a national repository. Our new way of storing orgunit boundaries
is a very small subset of such a full-blown GIS solution, but has the
advantage of being simple, lightweight and portable.

Agreed on both points, namely that the solution is lightweight and
aimed at thematic mapping, but that other solutions would be more
appropriate for use as a repository of GIS data.

Regards,
Jason

···

--
Jason P. Pickering
email: jason.p.pickering@gmail.com
tel:+17069260025

Hi Jason

Would you suggest then that the best place to clip precision would be
when the data is retrieved from the database for the specific view/map
rendering, rather than prior to it being stored?

This would render the current convenience of storing as a GeoJSON
string redundant, as we would need to process the string on checkout
anyway.

Can anyone say what the precision is on the shapefiles prior to
ogr2ogr conversion, i.e. are we introducing a new level of precision
here, or is that 15-digit precision the precision of the source
shapefiles?

Bob

Hi Jason

Can anyone say what the precision is on the shapefiles prior to
ogr2ogr conversion, i.e. are we introducing a new level of precision
here, or is that 15-digit precision the precision of the source
shapefiles?

Quoting myself:

"Here is a comparison of what I get in GeoJSON vs GML (converting from the same
shapefile):
GeoJSON: 38.415412, 1.750212
GML: 38.415411724082148,1.750212388592194"

Both using ogr2ogr. So 6 vs 15 decimals.

Knut

Hi all,

The number of decimals is not really the issue. If you use 6 decimals,
that is already enough for the type of GIS application we are
interested in. Using 15 decimals will not change the precision of your
map much, and it is not really necessary.
0 decimal places = approx. 112 km (70 miles) (for longitude, the actual ground distance depends on the latitude)
3 decimal places = approx. 111 m (365 feet)
6 decimal places = < 0.3 m (< 1 foot)
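
These figures follow from one degree of latitude being roughly 111 km,
with each decimal place dividing that by ten. A quick sanity check
(sketch only; the constant is the equatorial approximation):

public class DecimalPlacePrecision
{
    public static void main( String[] args )
    {
        double metersPerDegree = 111320.0; // approximate; varies with latitude

        for ( int places : new int[] { 0, 3, 6 } )
        {
            double meters = metersPerDegree / Math.pow( 10, places );
            System.out.printf( "%d decimal places ~ %.2f m%n", places, meters );
        }
    }
}

This prints roughly 111 km, 111 m and 0.11 m, matching the list above.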

The maps used by the system are not accurate enough to justify more
than 6 decimal places anyway, because they are not very large scale
maps (1:1 000 or 1:500); they are medium scale maps (1:50 000 or
1:100 000) or small scale maps.

The issue is more the cartographic generalization and the fact that it
does not preserve all the intricate geographical or other cartographic
details. It is necessary to run the generalization process in order to
use the GeoJSON format, but it removes a lot of data and simplifies
what remains. As a significant amount of data is lost in the process,
the output files are no longer appropriate for their original purpose
and scale, and the simplified GeoJSON files can't really be used in a
GIS.

Johan

···

-----Original Message-----
From: dhis2-devs-bounces+lemarchandjo=who.int@lists.launchpad.net [mailto:dhis2-devs-bounces+lemarchandjo=who.int@lists.launchpad.net] On Behalf Of Knut Staring
Sent: 26 July 2010 10:17
To: Bob Jolliffe
Cc: dhis2-devs@lists.launchpad.net
Subject: Re: [Dhis2-devs] Info on GIS development

On Mon, Jul 26, 2010 at 9:38 AM, Bob Jolliffe <bobjolliffe@gmail.com> wrote:

Hi Jason

On 26 July 2010 04:49, Jason Pickering <jason.p.pickering@gmail.com> wrote:

Hi Knut,

It may be that we want to use DHIS as both a repository with full
precision (though not ridiculously artifical ones like 15 decimal
lat/lon) and have a faster way of renderin. But for a repo, I think
something like PostGIS is in order. Or we could just store things as
GML...

Well, this is really the issue. If DHIS is going to be a repository,
any self-respecting GIS geek would not use it if the application
clipped precision. Although a few meters is not significant in terms
of rendering a map, it may cause havoc on certain datasets,
particularly if there are topological relationships between different
layers. If a facility is related topologically to a road network, and
the point is shifted a few meters, this may result in disturbance of
the topology between these layers, rendering DHIS useless as a
repository. ogr2ogr is perfectly OK as long as we are not dealing with
these types of layers, but as soon as we start to think about
relationships to other layers, we need to be very careful about how
the data is preprocessed.

Would you suggest then that the best place to clip precision would be
when the data is retrieved from the database for the specific view/map
rendering, rather than prior to it being stored?

This would render the current convenience of storing as a geojson
string redundant as we would need to process the string on checkout
anyway.

Can anyone say what the precision is on the shapefiles prior to
ogr2ogr conversion ie. are we introducing a new level of precision
here or is that 15 digit precision the precision of the source
shapefiles?

Quoting myself:

"Here is a comparison of what I get in GeoJSON vs GML (converting from the same
shapefile):
GeoJSON: 38.415412, 1.750212
GML: 38.415411724082148,1.750212388592194"

Both using ogr2ogr. So 6 vs 15 decimals.

Knut

Bob

We should be very conscient of not pushing the new, very simple
solution too far, for more complex functionality we should rather
employ Geoserver and PostGIS - and I still think this is the best
solution for a national repository. Our new way of storing orgunit
boundaries is a very small subset of such a full blown GIS solution,
but has the advantage of being simple, lightweight and portable.

Agreed on both points, namely that the solution is lightweight and
aimed at thematic mapping but other solutions would be more
appropriate for use as a repository of GIS data.

Regards,
Jason

--
Jason P. Pickering
email: jason.p.pickering@gmail.com
tel:+17069260025

--
Cheers,
Knut Staring

_______________________________________________
Mailing list: https://launchpad.net/~dhis2-devs
Post to : dhis2-devs@lists.launchpad.net
Unsubscribe : https://launchpad.net/~dhis2-devs
More help : https://help.launchpad.net/ListHelp

Hi Johan and Bob,

Johan, you are indeed correct that the generalization process may
remove the "cartographic intricacies", but this is very likely because
the generalization is performed either on geographical data where
there are no topological relationships between objects, or because the
generalization process does not respect the topology when it is
performed.

It would be possible to generalize a given set of polygons without
affecting their intrinsic topological relationships, but much more
care needs to be exercised when the generalization is performed. This
generalization could take place by removing unnecessary points
(simplification) and/or by reducing the precision of the data.
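
As a concrete illustration, the JTS simplify package already draws
exactly this distinction (a hedged sketch; note that even the
topology-preserving variant works per geometry, so shared borders
between adjacent districts still need extra care, which is the point
here):

import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.simplify.DouglasPeuckerSimplifier;
import org.locationtech.jts.simplify.TopologyPreservingSimplifier;

public class Generalizer
{
    public static Geometry fast( Geometry g, double tolerance )
    {
        // Plain Douglas-Peucker: smallest output, but may produce
        // invalid rings and break boundaries shared between polygons.
        return DouglasPeuckerSimplifier.simplify( g, tolerance );
    }

    public static Geometry careful( Geometry g, double tolerance )
    {
        // Keeps each geometry valid (no collapsed or self-intersecting
        // rings), though not boundaries shared across features.
        return TopologyPreservingSimplifier.simplify( g, tolerance );
    }
}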

Ultimately the point in doing this is to decrease the "bulk" of the
data that is presented to the client. I can imagine that a data set of
100 points with 15 decimals would behave more or less the same as a
dataset of 1000 points with 6 decimals (just guessing here). My point
is that there is a certain payload associated with each dataset.
Typically, server-side processing, in the form of rendering the GIS
layer to an image, would be employed. However, since we are using
vector data on the client side, the data should be preprocessed in
order to preserve the cartographic details that are important, as
automated simplification routines normally do not handle this. The
result is that the payload of the layer is decreased to a point that
is "acceptable" to users. I am sceptical about whether this step can
be automated at all for the reasonably complex polygon layers (i.e.
districts) that DHIS typically deals with.

I want to come back to the use of DHIS as a repository. At this
point, IMHO, DHIS does not seem appropriate as a health facility
repository. There is no way to easily adjust the metadata of a given
organizational unit object. I suppose we could use things like
orgunit groups to provide some type of metadata, but, for instance, we
may want each orgunit to have a property such as "Address", "Fax" or
"Elevation". Additionally, the proposed clipping of precision further
complicates matters in this regard. Ultimately, we want a quick,
responsive map for users as the first priority, and we should set our
sights on this.

In summary, I think that the current approach that we have, namely a
recommended workflow for how to preprocess a given set of data, should
not be supplanted by the system itself truncating the precision of
coordinates. There are many different generalization algorithms, each
with their pros and cons. Additionally, the generalization is highly
dependent on the scale of the map, and ultimately the pixel size of
the user's screen, implying that different datasets may need to be
generalized in different ways depending on their scale. The gory
details of how this is done by Geoserver (using GeoTools) are here:
http://docs.geoserver.org/stable/en/user/tutorials/feature-pregeneralized/feature-pregeneralized_tutorial.html.
We certainly do not need to recreate GeoTools or Geoserver, as they
are already very good at what they do. I would say that we should
instead consider leveraging these tools, letting them decide how to
generalize or not generalize features, depending on the scale of the
map that is requested by the users. I guess I am expressing a
fundamental gripe, that we should not baby users too much. If people
want to have 15 decimals, well, let them. They may have reasons for
this. It obviously does not make much sense, any more than using
50,000 points to represent a simple polygon that could be represented
with four vertices. In both cases, the GIS guys need to do their work
and understand what type of data is required by the client. Providing
clear recommendations for a workflow, coupled with guidelines on what
a "reasonable" payload to the browser would be, e.g. 30 KB versus
30 MB for a given layer, would be the best way to go I think.
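
To put rough numbers on that guideline, a back-of-envelope estimate
(the figures below are illustrative assumptions, not measurements):

public class PayloadEstimate
{
    public static void main( String[] args )
    {
        int polygons = 200;           // e.g. districts in one layer
        int verticesPerPolygon = 500;
        int bytesPerVertex6dp = 20;   // "35.241617,-1.042755," ~ 20 chars
        int bytesPerVertex15dp = 38;  // ~ 38 chars with 15 decimals

        System.out.printf( "6 decimals:  ~%.1f MB%n",
            polygons * verticesPerPolygon * bytesPerVertex6dp / 1e6 );
        System.out.printf( "15 decimals: ~%.1f MB%n",
            polygons * verticesPerPolygon * bytesPerVertex15dp / 1e6 );
    }
}

Trimming decimals roughly halves the payload here; it is cutting the
vertex count through simplification that gets a layer from megabytes
down toward tens of kilobytes.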

Regards,
Jason

Hi Jason,

I am not aware of very advanced topological relationships when GIS is
used for public health. I have seen much more advanced implementations
in GIS solutions for telecommunications or utilities. In the context
of DHIS, the main spatial relationship I can think of is answering the
question: how many health facilities are contained within a given
district?
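
That containment question is a one-line predicate with JTS; a hedged
sketch (names illustrative, not DHIS code):

import org.locationtech.jts.geom.Coordinate;
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.geom.GeometryFactory;
import org.locationtech.jts.geom.Point;

public class ContainmentExample
{
    public static long facilitiesWithin( Geometry district, Iterable<Coordinate> facilityLocations )
    {
        GeometryFactory factory = new GeometryFactory();
        long count = 0;

        for ( Coordinate location : facilityLocations )
        {
            Point facility = factory.createPoint( location );

            if ( district.contains( facility ) )
            {
                count++;
            }
        }

        return count;
    }
}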

The generalization which is currently performed simplifies the data a
lot, and I don't think it will be possible to preserve the topology
using the same parameters. If you use a better generalization
algorithm which keeps the relationships among spatial objects, the
size of the GeoJSON will be much bigger. At this point, I don't think
it is relevant to talk about topological relationships or scale when
the original layers are generalized with the currently specified
tolerance.

Johan

Hi Knut

Spotted this:
http://trac.osgeo.org/gdal/browser/trunk/autotest/ogr/ogr_gml_geom.py?rev=20065

gml_out_precision is failing on win32.

See line 517 [def gml_out_precision(): ]

Can you repeat this on something other than windoze and confirm the
same problem exists?

Cheers
Bob
