Basemap Generation

These are simply my experiences, put here as a method of documenting what I did. Hopefully it will be of use to someone else. The generated maps are suitable for training purposes and can be used as a starting point for competition maps. If you find anything that could be improved upon, please let me know at

Data Sources in Canada

NTDB/BNDT Files Definition

Each tile contains a number of shapefiles in a zip. Here is a list of each file and what it contains:

Filename ISOM Code Equivalent Entity Name
a_cable_l 516 AERIAL CABLEWAY
bridge_l 518 BRIDGE
buildin_a 526 BUILDING
buildin_p 526.1 BUILDING
builtup_a None BUILT-UP AREA
campgro_a 527 CAMPGROUND
cemeter_a 527 CEMETERY
chimney_p None CHIMNEY
contour_l 501 CONTOUR
cut_a 403 CUT
cut_lin_l 414 CUT LINE
dam_p 314 DAM
drivein_a 527 DRIVE-IN THEATRE
elev_pt_p None ELEVATION POINT
embankm_a None EMBANKMENT
golf_co_a 527 GOLF COURSE
li_depo_a 527 LIQUIDS DEPOT/DUMP
li_road_l 505 LIMITED-USE ROAD
mininga_a 527 MINING AREA
park_sp_a 401 PARK/SPORTS FIELD
pipelin_l 533 PIPELINE
railway_l 515 RAILWAY
road_l Various ROAD
runway_a 529.2 RUNWAY
sand_a 211 SAND
silo_p 535 SILO
tank_p 540 TANK
toponym_p None TOPONYM
tower_p 535 TOWER
trail_l 507 TRAIL
vegetat_a None VEGETATION
water_b_a 301 WATERBODY
water_c_l 305 WATERCOURSE
wetland_a 310 WETLAND
wharf_l 503 WHARF

Computer Software

Software Used by Me

OpenOrienteering Mapper (OOM)
An open-source OCAD replacement that runs on Windows, Mac OS X, Linux, and Android. Used for making maps.
GDAL/OGR
An open-source library used to convert between various GIS formats (both vector and raster files). It is very powerful, but also has a steep learning curve. It runs on Windows and Linux for sure; other platforms may work as well.
Karttapullautin
A Perl-based generator of orienteering training maps from LiDAR data. In addition to outputting an image, it also generates contours. Runs only on Windows, but can be converted to run on Linux (more on this below).
LAStools
A series of tools for manipulating and compressing LiDAR data. All of the tools run on Windows; a few of them are available for Linux as well. Note that the free versions of some of the tools introduce some distortion if you have a high number of points.

Other Useful Software

Terje Mathisen’s Perl Scripts for Processing Lidar
A series of Perl scripts (for Windows) that use LAStools to generate contours and other basemap data. No Linux support.
QGIS
A free and open-source Geographic Information System (an ArcGIS equivalent). Runs on Windows, Linux, Mac OS X, BSD, and Android. Useful for quickly visualizing different data sources together.
Purple Pen
Free course-setting software (a Condes equivalent) for Windows.

Ubuntu 16.04 Setup

As I use Linux as my main OS, I will detail how I set up my working environment. It presumes basic knowledge of using the Terminal. Windows users will have to make a few modifications or set up a virtual machine using VMware or VirtualBox.

  • Install OpenOrienteering Mapper as per the instructions on their website. At the time of writing (April 2017), use the “unstable” builds (version 0.7.x), as the 0.6.x builds don’t have sufficient support for importing DXF and other file types. Once installed, open the program, hit File->Settings, click on GDAL/OGR, and make sure DXF is selected. This ensures that you can import the data as orienteering-specific symbols as opposed to generic features.
  • Download LAStools and compile the Linux binaries following the instructions in the README file. Optional, but recommended: add the path of the built tools to your PATH environment variable (e.g. run export PATH=$PATH:/path/to/lastools/bin and/or add it to your .bashrc).
  • Install GDAL with the command sudo apt install gdal-bin.
  • Download Karttapullautin. Extract the zip file, then open pullauta.exe with Ubuntu’s Archive Manager and extract the file located in the scripts folder. Edit it, replacing all instances of “pullauta” with “perl” and all instances of “\\” with “\/” (this gives correct path names on Linux). You will likely need to install some of the required Perl modules; for me this was the GD and Geo::ShapeFile modules. You can install the GD module with sudo apt install libgd-perl. For Geo::ShapeFile, you can download the source and compile it from here, or you can use cpanm to install it (run sudo apt install cpanminus && sudo cpanm Geo::ShapeFile). Try running the script with the command perl – if it complains of any missing modules, Google around to figure out how to install them.
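Those two text replacements can be done in a single sed command. This is a sketch, assuming the extracted script is named pullauta.pl (yours may be named differently); the printf line just creates a stand-in file so the example is self-contained:

```shell
# Stand-in for the real script extracted from pullauta.exe
# ("pullauta.pl" is a hypothetical name -- use your actual file).
printf 'system("pullauta temp\\\\vegetation.png");\n' > pullauta.pl

# Replace every "pullauta" with "perl" and every "\\" with "\/":
sed -i 's/pullauta/perl/g; s/\\\\/\\\//g' pullauta.pl
```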

Getting Started – Projections!

Choose a projection! It is strongly recommended to keep everything in the same projection for ease of use. I recommend using the UTM zone you are located in. For me, in BC, this is UTM zone 11N, which has an EPSG code of 32611.


If you’re lucky, the lidar (laz/las) files you’re using specify the projection in their header. If not, ask whoever provided the data what projection they used. You can view the info of a lidar file using the “lasinfo” command – run lasinfo -cd FILENAME.las. The output should be similar to

reporting all LAS header entries:
  file signature:             'LASF'
  file source ID:             0
  global_encoding:            0
  project ID GUID data 1-4:   00000000-0000-0000-0000-000000000000
  version major.minor:        1.0
  system identifier:          'LAStools (c) by rapidlasso GmbH'
  generating software:        'las2las (version 160710)'
  file creation day/year:     335/2016
  header size:                227
  offset to point data:       323
  number var. length records: 1
  point data format:          1
  point data record length:   28
  number of point records:    160238229
  number of points by return: 93148464 49521079 15049536 2312070 196781
  scale factor x y z:         0.01 0.01 0.01
  offset x y z:               300000 5600000 0
  min x y z:                  332976.72 5637138.21 345.65
  max x y z:                  336448.80 5641391.34 892.32
variable length header record 1 of 1:
  reserved             0
  user ID              'LASF_Projection'
  record ID            34735
  length after header  40
  description          'by LAStools of rapidlasso GmbH'
    GeoKeyDirectoryTag version 1.1.0 number of keys 4
      key 1024 tiff_tag_location 0 count 1 value_offset 1 - GTModelTypeGeoKey: ModelTypeProjected
      key 3072 tiff_tag_location 0 count 1 value_offset 26911 - ProjectedCSTypeGeoKey: NAD83 / UTM 11N
      key 3076 tiff_tag_location 0 count 1 value_offset 9001 - ProjLinearUnitsGeoKey: Linear_Meter
      key 4099 tiff_tag_location 0 count 1 value_offset 9001 - VerticalUnitsGeoKey: Linear_Meter
the header is followed by 2 user-defined bytes
LASzip compression (version 2.4r1 c2 50000): POINT10 2 GPSTIME11 2
reporting minimum and maximum for all LAS point record entries ...
  X             3297672    3644880
  Y             3713821    4139134
  Z               34565      89232
  intensity       21102      65535
  return_number       1          7
  number_of_returns   1          7
  edge_of_flight_line 0          0
  scan_direction_flag 0          0
  classification      1          2
  scan_angle_rank   -25         25
  user_data          64         65
  point_source_ID 26003      56016
  gps_time 158082629.636509 158087772.829990
WARNING: range violates GPS week time specified by global encoding bit 0
number of first returns:        93148464
number of intermediate returns: 17571961
number of last returns:         93140777
number of single returns:       43622973
covered area in square meters/kilometers: 9175776/9.18
point density: all returns 17.46 last only 10.15 (per square meter)
      spacing: all returns 0.24 last only 0.31 (in meters)
WARNING: there are 9992 points with return number 6
WARNING: there are 307 points with return number 7
overview over number of returns of given pulse: 43622973 68930770 38223995 8465489 934709 58180 2113
histogram of classification of points:
       135808509  unclassified (1)
        24429720  ground (2)

From this, you can see that the data is in meters and is in the NAD83 / UTM 11N projection. As well, the point density (only calculated because we passed the “-cd” parameter to lasinfo) is 17.46 points/m2 for all returns and 10.15 for last returns only – this is relatively high-quality lidar; it will often be much lower.

If you have multiple lidar files and/or need to change the projection, you can use the “las2las” command to make one file with the correct projection. For example, run las2las -i LAZ/*.laz -merged -o out.laz -epsg 3005 -target_utm 11N, where your files are located in the LAZ folder, you’re converting from a projection with EPSG code 3005, and your target projection is UTM zone 11N.


You can convert the projection of the shapefiles using the “ogr2ogr” command, which is part of GDAL. Use the command ogr2ogr -skipfailures -f "ESRI Shapefile" 2007-BNDT-CONVERTED 2007-BNDT-SOURCE -t_srs EPSG:32611, where 2007-BNDT-CONVERTED is the folder you want the data output to, 2007-BNDT-SOURCE is the folder containing the unzipped NTDB/BNDT data, and EPSG:32611 is the EPSG code of the target projection (WGS 84 / UTM zone 11N in this case).


Basic Usage

Karttapullautin is controlled via the pullauta.ini file. In general, the default settings work fairly well. However, you’ll want to change the “northlinesangle” parameter to the magnetic declination at your map (you can calculate it here). If your lidar data has a high number of points, you can lower the “thinfactor” to speed up the process. Typically anything at 2 points/m2 or higher gives the same output – so calculate this factor based on the output of the lasinfo command above.
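As an illustration, the relevant lines in pullauta.ini look something like this – the values below are examples derived from the lidar analyzed above, not recommendations:

```ini
# pullauta.ini excerpt (example values -- adjust for your area)

# Rotate the north lines to your local magnetic declination (degrees).
northlinesangle=16.3

# Fraction of points to keep. With ~17.46 pts/m2 from lasinfo above,
# thinning to ~2 pts/m2 means thinfactor = 2/17.46, i.e. about 0.115.
thinfactor=0.115
```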

Karttapullautin itself is run with the command perl input.laz on Linux or pullauta.exe input.laz on Windows. It then parses the input file and spits out a PNG image of the map. It saves all of the intermediate files in the temp folder. Here is what each of the files is:

This is the main image output
This is the world file for georeferencing the main image. The coordinate system is whatever the input lidar file was in.
This is the same as the main image output but the depressions are highlighted in pink. This is useful if you’re using this as a template in OCAD or OOM.
This is the world file for georeferencing the depressions image. The coordinate system is whatever the input lidar file was in.
This contains any detected water, if you’ve enabled this feature in pullauta.ini (defaults to off meaning this is a blank file)
PNG image containing the biggest cliffs
DXF file containing the biggest cliffs
DXF file containing the smallest cliffs
DXF file containing 0.3m contours. This file is typically very big – you’ll need a fast computer to render it in OOM or QGIS.
Intermediate file, not very useful
Intermediate file, not very useful
DXF file containing dot knolls and small U depressions
Intermediate file, not very useful
Intermediate file, not very useful
Intermediate file, not very useful
DXF file containing unsmoothed contours at 2.5m, an intermediate file
DXF file containing smoothed contours at 2.5m, with every second one a form line and an index contour every 12.5m
Intermediate file, not very useful
Intermediate file, not very useful
PNG file containing the undergrowth.
This is the world file for georeferencing the undergrowth image. The coordinate system is whatever the input lidar file was in.
PNG file containing the vegetation.
This is the world file for georeferencing the vegetation image. The coordinate system is whatever the input lidar file was in.
Intermediate file related to contour generation, not very useful
Intermediate file related to 0.3m contour generation, not very useful
Intermediate file related to knolls, not very useful
Intermediate file generated from the initial lidar file, not very useful
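Several of the files above are “world files” used to georeference the corresponding PNGs. A world file (e.g. a .pgw alongside a .png) is just six plain-text numbers, one per line: the pixel x-size, two rotation terms (usually 0), the negative pixel y-size, and then the x and y coordinates of the centre of the top-left pixel. A hypothetical example, using a 0.5 m pixel and coordinates in the range of the lasinfo output above:

```
0.5
0.0
0.0
-0.5
332976.72
5641391.34
```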

Read the readme.txt for some more info on how to fine-tune the output and how to re-run just the parts you need to.

Adding the NTDB/BNDT Dataset (or Other Data)

Karttapullautin supports drawing certain symbols on the map from shapefiles. It maps these features based on the contents of the file specified by the “vectorconf” parameter in pullauta.ini. The vectorconf file is line-based, with three parameters on each line separated by the “|” character: the first is a human-readable description of the symbol, the second is the ISOM symbol code to map to (only certain ones are supported), and the third is a mapping between an attribute in the shapefile’s attribute table and its value. I’ve created a vectorconf file for the NTDB/BNDT dataset which is available here. I’ve also created one for the main GeoGratis data but haven’t tested it. You can download it here.
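For illustration only, entries following that description|code|attribute=value shape look like the lines below. The attribute names and values here are hypothetical – inspect your shapefiles’ attribute tables (e.g. in QGIS) for the real ones:

```
wetland|310|THEME=wetland
limited-use road|505|ROADCLASS=4
```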

Once you’ve edited pullauta.ini to refer to the correct vectorconf, run perl (pullauta.exe on Windows), passing a zip file containing all of the data (in the right projection!) that you want to render.

You can also render OpenStreetMap data using the osm.txt vectorconf file provided with Karttapullautin. Note that you need to convert the OSM data you downloaded earlier to shapefiles using ogr2ogr.

Generating Contours at a Different Interval

If you don’t want 2.5m contours, you can filter down to a larger contour interval by copying/renaming the temp folder to temp1 and then running perl 1 xyz2contours 10 null new_contours.dxf (pullauta.exe 1 xyz2contours 10 null new_contours.dxf on Windows), where 10 is the contour interval (I’ve only tested increments of 2.5m) and new_contours.dxf is your output file.

A Note on Batch Mode and RAM Usage

Karttapullautin has a batch mode in which it processes multiple lidar files simultaneously. It is activated by parameters in pullauta.ini. The advantages are multi-threaded processing (faster, since you’re doing work in parallel) and being able to process more data without running into your maximum RAM usage. After all the tiles have been processed, you can merge the output files. If you only want an image and don’t need the DXF files for importing into OOM/OCAD, this is the way to go. Unfortunately, the DXF files are merely appended to each other, so when you import them into OOM/OCAD, each contour is broken into a number of sections. Manually merging them is a pain, so I recommend not using batch mode if you plan on using the generated contours.

For me, I ran into the upper RAM limit for a 32-bit executable on Windows when processing my ~10 km2 map. That’s the main reason I started using Linux: you can run the 64-bit version of Perl, so you’re no longer limited to 4GB per process. If you’re using batch mode, RAM usage is less of a problem as you’re not processing files as big.

Importing into OOM

Karttapullautin-Generated Data

After you’ve created a georeferenced map in OOM (again, use the same projection as the rest of your data), you can import the cliffs, knolls, and contours. OOM (like OCAD) uses a CRT (cross reference table) file to import the objects as the right symbols. The format of a CRT file is the ISOM symbol code, followed by a space, followed by the layer code of the DXF file. A CRT file for karttapullautin-generated data can be downloaded here – make sure to remove the .txt extension (original source here). Copy it and rename it to the same name as the DXF file you wish to import (just change the file extension to .crt). Then, in OOM, click File->Import and select the file. You should get a prompt for matching symbols, similar to the image here.
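For example, a CRT file is just lines like the following. The ISOM codes on the left are standard, but the layer names on the right are hypothetical – they must match the layer names actually present in your DXF:

```
101 contour
102 index_contour
103 formline
203 cliff
```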

Adding the NTDB/BNDT Dataset

OOM doesn’t support importing shapefiles and mapping them to the right orienteering symbols in one step. You can either import each of the shapefiles and manually change the symbol for each object, or convert them to DXF files and import them with a CRT. I’ve created a (Linux) shell script, available here, to convert the data and create the CRT file. It also converts the projection – change the variable at the top of the script to the EPSG code you’re using. Then just import each DXF file in OOM and run with it! Some features don’t map very well to orienteering symbols – they will be imported as generic “Line” or “Area” symbols.
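A minimal sketch of that approach is below. The shapefile names and symbol pairings are hypothetical examples, and the ogr2ogr step only runs if GDAL is installed and the shapefile exists:

```shell
#!/bin/sh
# Convert each shapefile to DXF and write a matching one-line CRT.
EPSG=32611   # target projection -- change to yours

# Hypothetical "shapefile_basename:isom_code" pairs
for pair in contour_l:101 trail_l:507 wetland_a:310; do
  layer=${pair%%:*}
  code=${pair##*:}
  # Reproject and convert to DXF (note: the layer name inside the
  # generated DXF may differ -- check it and adjust the CRT if needed)
  if command -v ogr2ogr >/dev/null 2>&1 && [ -f "$layer.shp" ]; then
    ogr2ogr -f DXF "$layer.dxf" "$layer.shp" -t_srs "EPSG:$EPSG"
  fi
  # CRT format: ISOM code, space, DXF layer name
  printf '%s %s\n' "$code" "$layer" > "$layer.crt"
done
```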

Adding the Data from the GeoGratis Extraction

I’ve created a Perl script to convert the shapefiles into DXF files with CRT files suitable for importing into OpenOrienteering Mapper. It is located at – complete instructions on how to use it are located there as well.

Other Useful Links

The following links were useful for me in determining the right process to use and may be of interest to other people: