TerraGear Scenery Tools README
==============================
TerraGear is a collection of tools for building scenery for the
FlightGear project. Generally, the process is done in two steps:
1. Preprocess the original raw data. This chops up the data into
the FG tiling scheme and saves it in a simple, intermediate
format.
2. Collect all the different pieces of intermediate data and
assemble them into a 3d model of the terrain.
There is currently no graphical front end for these tools, so you will
need to run them from the command line. When building scenery on a
worldwide scale, be prepared to burn through multiple gigabytes of
disk space and days or weeks of crunching. Building smaller chunks is
much more doable though.
Building the Tools
==================
These tools are primarily compiled and tested under Unix with the GNU
compilers. I believe they also build and run on Windows with Cygwin.
If anyone has patches for supporting other platforms, I will be happy
to incorporate them.
The process for building these tools is very similar to building the
main FG source code.
1. If you are using the CVS version of the source, run the
"autogen.sh" script. If you downloaded the source tarball, then
don't.
2. Run the "configure" script, with optional arguments for setting
the install prefix, etc.
3. Run "make"
4. Run "make install"
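As a hedged sketch, those four steps look like this from the top of the source tree; the install prefix is just an example, and the guards make the commands no-ops outside a TerraGear checkout:

```shell
# Sketch of the build steps above; $HOME/terragear is an example prefix.
# The guards keep this from failing outside a TerraGear source tree.
if [ -x ./autogen.sh ]; then        # CVS checkouts only; tarballs skip this
    ./autogen.sh
fi
if [ -x ./configure ]; then
    ./configure --prefix="$HOME/terragear" &&
    make &&
    make install
fi
echo "build steps finished"
```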
Preprocessing Terrain
=====================
TerraGear supports several terrain data sources:
1. 30-arcsec SRTM based terrain data covering the world (recommended
over other 30-arcsec data sources):
ftp://edcsgs9.cr.usgs.gov/pub/data/srtm/SRTM30/
I don't recall the details at the moment for processing this data.
Probably similar to the processing of the GLOBE data.
2. 30-arcsec world wide data: GLOBE project:
http://www.ngdc.noaa.gov/seg/topo/globe.shtml
a) First convert the "bin" DEM format to "ascii" DEM format using
"Prep/DemRaw2ascii/raw2ascii"
b) Then process the resulting files with "Prep/DemChop/demchop"
3. 30-arcsec world wide data: GTOPO30 data:
http://edcwww.cr.usgs.gov/landdaac/gtopo30/gtopo30.html
a) First convert the "bin" DEM format to "ascii" DEM format using
"Prep/DemRaw2ascii/raw2ascii"
b) Then process the resulting files with "Prep/DemChop/demchop"
4. SRTM (1 and 3-arcsec nearly world wide coverage):
ftp://edcsgs9.cr.usgs.gov/pub/data/srtm/
a) Chop up the .zip files using "Prep/DemChop/hgtchop"
5. 3-arcsec ASCII DEM files:
Generally, I recommend using the SRTM data over this older data
set, however in places like Alaska, there is no SRTM coverage so
this data is better than the 30 arcsec data.
http://edcwww.cr.usgs.gov/doc/edchome/ndcdb/ndcdb.html
a) Create the .arr.gz files using the "Prep/DemChop/demchop" utility.
The result for any of these terrain sources should be a "work" tree
with a .arr.gz file for each FG tile.
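As one concrete example of this chopping phase, the SRTM route in item 4 might be scripted as below. The directory layout and the hgtchop arguments are assumptions (run hgtchop with no arguments to see its real usage); the guard makes this a no-op if the tool is not installed:

```shell
# Hypothetical sketch: chop downloaded SRTM tiles into the work tree.
# RAW_DIR, WORK_DIR, and the hgtchop argument order are assumptions;
# check the tool's own usage message before relying on them.
RAW_DIR=${RAW_DIR:-$HOME/raw/srtm}
WORK_DIR=${WORK_DIR:-$HOME/workdir}
count=0
if command -v hgtchop >/dev/null 2>&1; then
    for f in "$RAW_DIR"/*.zip; do
        [ -f "$f" ] || continue
        hgtchop "$f" "$WORK_DIR/SRTM-3"   # argument order is an assumption
        count=$((count + 1))
    done
fi
echo "chopped $count file(s)"
```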
6. After you create the .arr.gz files you have to create a
corresponding .fit.gz file for each of these. This is a data
reduction step which fits a set of polygons to the raw terrain with
a set of constraints on the maximum error allowed relative to the
original data set, and a max/min number of allowed nodes in the
fitted set. The provided tools use a scheme that produces an
adaptive fit which means fewer polygons in smooth flat areas, and
more polygons in complex rough areas. The end result is a *much*
better fit with fewer polygons than you could achieve by just keeping
every "nth" point from the original array.
To walk through an entire tree of .arr.gz files and produce the
corresponding .fit.gz files, use the "Prep/TerraFit/terrafit.py"
utility. Please ignore the old "ArrayFit" tools which use a stupid
algorithm and are basically useless in comparison to TerraFit.
You should now have a large tree of .arr.gz files with a corresponding
.fit.gz file for each .arr.gz file. It's worth double-checking the
contents of your directory and counting all files of each type to make
sure you have a one-to-one match and didn't miss anything.
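That one-to-one check is easy to automate; this is just a sketch, and the work-tree path is an example:

```shell
# Verify that every .arr.gz under the work tree has a matching .fit.gz.
# WORK is an example path; point it at your actual work directory.
# (Paths containing spaces would need find -exec instead of this loop.)
WORK=${WORK:-./work}
missing=0
for arr in $(find "$WORK" -name '*.arr.gz' 2>/dev/null); do
    fit="${arr%.arr.gz}.fit.gz"
    if [ ! -f "$fit" ]; then
        echo "missing: $fit"
        missing=$((missing + 1))
    fi
done
echo "$missing missing .fit.gz file(s)"
```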
Generating Airports
===================
Robin Peel maintains a world wide database of airports and navaids for
the X-Plane and FlightGear projects:
http://www.x-plane.org/users/robinp/
Robin's apt.dat needs to be run through two scripts:
cat apt.dat | ./xp2simpleapt.pl > basic.dat
cat apt.dat | ./xp2runway.pl > runways.dat
Compress these and copy them to $FG_ROOT/data/Airports
Now run runways.dat through the genapts utility:
genapts --input=runways.dat --work=$FG_WORK_DIR
Note: this creates a last_apt file which shows you the airport genapts
is currently working on.
- if genapts crashes (which is possible if you try to run through the
entire runways.dat file) you can look at last_apt to see where to start
up again.
- You can start in midstream using the --start-id=KABC option to genapts.
- If you get a consistent crash on a particular airport, you probably
found a bug in genapts, or there is some degenerate information at that
airport (40-mile-long runways, 2 runways spaced miles apart, etc.).
Often you can fix the data and proceed. Sometimes you can "nudge"
things around to get past a genapts bug. For instance, if you crash
consistently on a valid-looking runway, try nudging the heading or
position by a least significant digit. Sometimes we get numerical
problems with the polygon clipper, and this often works around them.
Other considerations:
- Airport generation depends on the terrain data already being prepped
and ready so airport surfaces can be built properly.
- If you prep new terrain data, you should probably rerun the airport
generation step.
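Putting the airport steps together, a run might look like the following sketch. The file locations are examples, the genapts options are the ones shown above, and --start-id (with a placeholder airport ID) is only needed when resuming after a crash; the guard makes this a no-op when the tools are not on the PATH:

```shell
# Hypothetical end-to-end airport run; paths and KXYZ are examples.
FG_WORK_DIR=${FG_WORK_DIR:-$HOME/workdir}
status=ok
if command -v genapts >/dev/null 2>&1; then
    ./xp2simpleapt.pl < apt.dat > basic.dat
    ./xp2runway.pl    < apt.dat > runways.dat
    genapts --input=runways.dat --work="$FG_WORK_DIR" || {
        # On a crash, last_apt names the airport being processed; resume with:
        #   genapts --input=runways.dat --work="$FG_WORK_DIR" --start-id=KXYZ
        status=crashed
    }
fi
echo "$status"
```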
Processing VMAP0 data
=====================
Most of the FlightGear terrain features come from the VMAP0 data set
available from:
http://geoengine.nima.mil/
There is a script in src/Prep/TGVPF/ called process.sh which will
generate all the VPF data for all 4 CD's. Look at the script and
understand what it's doing. I don't run it exactly as is because it
takes too long to run everything at once and if something bombs in the
middle, you generally have to start from the beginning just to be on
the safe side and avoid duplications and other potential weirdness.
I usually run each CD individually and copy the data to a separate area
when done. Then I reassemble all the VPF data at the end before
running the final tile builder.
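A hedged sketch of that per-CD workflow follows. How process.sh is split up per CD is an assumption (read the script itself), and every path here is invented for illustration:

```shell
# Hypothetical per-CD VMAP0 workflow; all paths are examples, and the
# per-CD invocation of process.sh is an assumption -- read the script.
WORK_DIR=${WORK_DIR:-$HOME/workdir}
SAVE_DIR=${SAVE_DIR:-$HOME/vpf-done}
done_cds=0
for cd in cd1 cd2 cd3 cd4; do
    [ -d "/mnt/$cd" ] || continue         # skip CDs that are not mounted
    # ... run the relevant portion of src/Prep/TGVPF/process.sh here ...
    mkdir -p "$SAVE_DIR/$cd"
    cp -r "$WORK_DIR"/. "$SAVE_DIR/$cd"/  # copy this CD's output aside
    done_cds=$((done_cds + 1))
done
echo "processed $done_cds CD(s)"
```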
Processing Radio Tower data
===========================
I found some very detailed, very complete, very current radio tower
data from the FCC. This even includes cell phone towers and
individual hobbyist towers. However, it only has USA coverage:
http://wireless.fcc.gov/cgi-bin/wtb-datadump.pl
When you unzip this file, the good stuff is in DE.dat. I believe there
is file format information at the above URL.
Run:
.../Prep/Tower/tower.pl --input=$RAW_DIR/DE.dat --outdir=$WORK_DIR/TowerObj
This creates a tree of .ind files in $WORK_DIR/TowerObj/
If you add this directory to the list of directories searched by the
build tile program, these entries should get added to the final
scenery. This of course requires the corresponding tower models to be
available in the base package.