USGS - science for a changing world

Woods Hole Coastal and Marine Science Center


U.S. Department of the Interior
U.S. Geological Survey

Methods for Compiling and Publishing Coastal and Marine Geology (CMG) Information and Data

Edited by Chris Polloni

Contributions by Jane Denny, Dave Foster, Jim Robb, Barbara Seekins and Glynn Williams

US Geological Survey
Woods Hole, MA 02543

Open-File Report 98-802

September 1999



This report is preliminary and has not been reviewed for conformity with U.S. Geological Survey editorial standards and stratigraphic nomenclature. Any use of tradenames is for descriptive purposes only and does not imply endorsement by the USGS.


Table of contents

Useful Web Links





Desktop tools for assembling publication-quality text and graphics have developed rapidly in recent years as specialty software has grown more sophisticated and better integrated. We can now design and build pre-press material, and in some cases final products, on a single low-cost (~$5K) personal computer with access to a myriad of loosely organized and documented software, hardware, and data. This paper discusses some of the tools in use at the USGS Woods Hole Field Center (WHFC) and the standards being followed to produce publications for the public and for the USGS data archive system. It also addresses issues related to the use of Geographic Information Systems (GIS), which can enhance the delivery of accurate and timely information both to the general public and to our collaborators and associates. The discussion includes links to the Publications Groups at the USGS National Centers, a description of the interaction and standards provided by these resources, and the guidelines and processes that must be followed to get our publications out to the public.

An important issue for the WHFC is compatibility with resources beyond our office environment: commercial printers, in-house (USGS) print resources, and anyone who wants to use our products, or include copies of our data in their own publications and maps. How to use and reference our data needs to be addressed so that our information is at least reasonably compatible with most environments. We will, in fact, work on solutions with anyone willing to help engineer a reasonably workable approach.

This paper is the result of considerable frustration with software that appeared to be compatible but lacked the latest patch or minor update, or suffered from cross-platform issues when files were moved from Microsoft Windows environments to Apple™ and UNIX platforms to reach the system a graphic artist was using. In many cases cross-platform issues matter less than software revision levels. One example is CorelDraw™: we have dealt with revisions 3.0, 4.0, 5.0, 6.0, 7.0, and various flavors of 7.0, which, if you are not careful, simply will not work unless you have exactly the right revision of the software. Another example is ArcView™: we have used ArcView™ project files to join different GIS coverages into a coherent presentation, only to find that ArcView™ 3.1 project files cannot be used with ArcView™ 3.0.

In many cases sophisticated compression schemes are implemented that are not necessarily documented within a software release and can only be figured out by carefully keeping up with bug sheets and the latest patches. In other cases new extensions add capabilities that are incompatible with earlier revisions. Our fact sheets and map files tend to be designed in 10-50 Mbyte chunks that test the capability of our desktops; surprisingly, many do get transferred over the web and do load correctly, but only if we are using compatible software! The solution may come down to limiting data exchange for production work to agreed software release levels. This would require individuals to use the "save as" option against a published release level for common software, or simply to standardize everyone on a previously agreed level of software, with the releases publicized over the web so that everyone can stay current.
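The release-level bookkeeping described above can be sketched as a simple lookup against a published list of agreed releases. The table and function names below are hypothetical illustrations, not an actual WHFC standard:

```python
# Hypothetical sketch: check a file's producing-software release level against
# a published list of versions agreed upon for production exchange.
# The approved-versions table is illustrative only.

APPROVED_RELEASES = {
    "CorelDraw": "7.0",
    "ArcView": "3.0",   # project files saved by 3.1 cannot be opened in 3.0
}

def is_exchangeable(software: str, version: str) -> bool:
    """Return True if a file saved at this release level is safe to exchange."""
    approved = APPROVED_RELEASES.get(software)
    return approved is not None and version == approved

print(is_exchangeable("ArcView", "3.1"))    # False: newer than the agreed level
print(is_exchangeable("CorelDraw", "7.0"))  # True
```

A check like this could run when files are staged for transfer, catching revision mismatches before a graphic artist discovers them.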

return to table of contents



My focus over the past five years has been to recover data acquired in various forms and publish them on CD-ROM. This past year I was asked to ensure that, in addition to archiving data and its accompanying metadata on CD-ROM, I document an easy method for printing USGS-certified paper maps from the digital imagery and text we are creating. To achieve this we need to interface with the map editors and the film-writing specialists to eliminate the compatibility problems that have plagued the products we create and caused lengthy delays. We have established some new connections, purchased new software, and gained some critical knowledge that puts us in a position to connect the digital desktop directly to the digital film writer, eliminating the costly photographic processes and manual editing that made map publishing a complicated and unnecessarily cumbersome process.

We have developed map templates that begin within our GIS software and pass the map graphic to a pre-press editing process (CorelDraw™ or Adobe Illustrator™), which develops the print-on-demand product. The WHFC has focused on building web-compatible products that users can view and download, delivered in our ArcInfo™ format and compatible with ArcView™ software. Until recently, however, Center staff had no training in the software needed to manipulate the data being offered over the web. The same has been true of the CD-ROMs we have published: I have had feedback from the public, students, and colleagues who have used them, but little or no interaction with WHFC staff. Either we chose the wrong format for the data, or the staff lacks training in the use of digital data, whether on CD-ROM or on the web.

We have a number of staff familiar with graphics software, and we have built some interesting clip-art that is used extensively. Only a few, however, have been successful with GIS tools; most rely on specialists hired to work with the data and build map products as requested. What seems to be needed is a browser that lets staff inspect data sets easily, construct imagery and text that are simple to work with, and provide a template for delivering map products. These can then be developed, edited, and printed locally or off site, either through commercial vendors or through the USGS National Mapping Division (NMD) facilities, to build standard products for distribution and sale to the general public. The same logic could be used with our map server to provide access to published, metadata-documented data layers for building new map products for decision making or reference.

Commercial software tools have recently arrived that make this effort doable, but they require some cooperation and the use of data standards. Most of our cooperators use Arc/Info™ software and are familiar with its data exchange protocol. Many use the metadata AML tool extension available for workstation Arc/Info™. We have published CD-ROMs with HTML code to make them web compatible and easy to use with any available browser. This structure follows a general design concept we adopted in 1995: to make all of our data available through the window of a web browser.
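The HTML-on-CD-ROM approach can be illustrated with a small script that generates a browser-ready index page linking the archived files. The file names and page layout here are hypothetical, not the actual CD-ROM template:

```python
# Illustrative sketch of the "HTML on CD-ROM" idea: generate a simple
# index.html that links the archived data files, so any web browser can
# serve as the disc's front end. File names are made up for illustration.

def build_index(title, data_files):
    """Return an HTML page with one list item per archived file."""
    items = "\n".join(f'  <li><a href="{name}">{name}</a></li>' for name in data_files)
    return (f"<html><head><title>{title}</title></head><body>\n"
            f"<h1>{title}</h1>\n<ul>\n{items}\n</ul>\n</body></html>")

html = build_index("Cruise Archive", ["metadata/survey.met", "images/mosaic.tif"])
```

Writing the result to `index.html` at the root of the disc lets the archive open directly in a browser with no special software installed.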

One of the first experiments was to make a map-maker software package available on the web. Its intent was to generate maps, by interactive request, that included shoreline, tracks, and data points in a familiar format. This package, Coastmap (Signell, 1997), is used extensively by non-Center applications. This past year we began making archive CD-ROMs of raw data on the ship as we collect it. In that environment we have an archive template and methods for dealing with sidescan-sonar and seismic-reflection data. Standard cruise maps are being created with the UNIX Generic Mapping Tools (GMT) package, which has proved a useful scripted mapping tool for simple maps. We are also developing real-time displays with MapInfo™ and ArcView™. As these procedures become more familiar they may become the standard methods of map-product generation.
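GMT maps are driven by short shell scripts. As a minimal sketch, written in Python for clarity, the script for a simple cruise map (shoreline plus tracklines) might be assembled as below; the region, file names, and exact option choices are illustrative rather than the Center's actual template:

```python
# Sketch of a scripted GMT cruise map in the spirit described above.
# pscoast draws the shoreline; psxy overlays the ship's tracklines.
# Region, pens, and file names are hypothetical examples.

def cruise_map_script(region, trackfile, out="cruise_map.ps"):
    """Return a classic-GMT shell script that plots a coastline and tracklines."""
    return "\n".join([
        "#!/bin/sh",
        # Shoreline: Mercator projection, full-resolution coastline, 1-degree ticks
        f"pscoast -R{region} -JM6i -Df -W0.5p -Ba1 -P -K > {out}",
        # Overlay tracklines from an x/y file onto the same plot
        f"psxy {trackfile} -R -JM6i -W1p -O >> {out}",
    ])

script = cruise_map_script("-71/-69/41/43", "tracks.xy")
```

Because the whole map is a script, the same template can be rerun each time new navigation arrives during a cruise.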




Data-integration tools and concepts using page-layout software (Word™, WordPerfect™, PageMaker™, and QuarkXPress™) and graphic editors (Photoshop™, CorelDraw™, Illustrator™, and FreeHand™) allow the user to prepare final copy of maps and imagery that were prepared with PCI™, Earthvision™, ArcInfo™, ArcView™, GMT, ABICAS, and MapInfo™. These tools need one primary source of data for building layers, in a standard form with attributes that define all of the underlying aspects of the history and construction of the data. The Center has begun to build central metadata archives that are available as searchable indexes of our data holdings. What we need are standard file types for the data used to assemble maps and publications.

The Arc/Info™ export file format (.E00 file type) has been a de facto standard for data distribution by many state agencies and within the federal government (WRD, for example, has mandated its use), and the new Spatial Data Transfer Standard (SDTS) has begun to be used. Recently we have found the ArcView™ shapefile format quicker and easier for many data distributions. Lacking a single file type, we could at least select a group of file types compatible with the previously mentioned software in use at the WHFC. We have developed a Federal Geographic Data Committee (FGDC) metadata template that the Sea Floor Mapping group uses for seismic, swath-bathymetry, and sidescan-sonar data; it has been implemented aboard ship as data are archived with documentation to CD-ROM. The CD-ROM template developed by Paskevich et al. (Open-File Report 96-83) was agreed on last year and provides a means of releasing documented data to the general public as an open-file report.
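An FGDC metadata record is hierarchical, indented text. A skeletal sketch of the citation and bounding-coordinate portions might be generated as follows; the field values are placeholders, not the WHFC template itself:

```python
# Skeletal FGDC-style metadata stub, echoing the template idea described
# above. Only a few standard CSDGM sections are shown; all values are
# illustrative placeholders.

def fgdc_stub(title, originator, west, east, north, south):
    """Return a minimal FGDC identification section with bounding coordinates."""
    return f"""Identification_Information:
  Citation:
    Citation_Information:
      Originator: {originator}
      Title: {title}
  Spatial_Domain:
    Bounding_Coordinates:
      West_Bounding_Coordinate: {west}
      East_Bounding_Coordinate: {east}
      North_Bounding_Coordinate: {north}
      South_Bounding_Coordinate: {south}"""

record = fgdc_stub("Sidescan-sonar mosaic", "U.S. Geological Survey",
                   -71.0, -69.0, 43.0, 41.0)
```

Generating the bounding coordinates from the cruise plan is what makes it practical to have much of the record filled in before data collection begins.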

The metadata files created as the data are collected are now being aggregated on the central data server and keyword indexed (Robb, informal comm.) for the knowledge bank. This index gives an indication of the number of CD-ROMs created and their content. The CD-ROM archive includes map objects that display coastlines, ship tracklines, and in some cases sample locations. In the near future they will include preliminary sidescan-sonar mosaics and bathymetry contour lines.
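The keyword index amounts to an inverted mapping from theme keywords to the records that carry them. A minimal sketch, with made-up record identifiers and keywords, assuming each metadata record contributes a list of theme keywords:

```python
# Sketch of keyword indexing for the knowledge bank: invert a set of
# metadata records into a keyword -> record-ids index. Record ids and
# keywords are hypothetical examples.

def build_keyword_index(records):
    """records maps a CD-ROM/record id to its list of theme keywords."""
    index = {}
    for rec_id, keywords in records.items():
        for kw in keywords:
            index.setdefault(kw.lower(), []).append(rec_id)
    return index

index = build_keyword_index({
    "OFR-97-61": ["sidescan sonar", "bathymetry"],
    "OFR-96-83": ["sidescan sonar", "CD-ROM archive"],
})
print(sorted(index["sidescan sonar"]))  # ['OFR-96-83', 'OFR-97-61']
```

A search against such an index immediately answers "which discs hold sidescan data," which is the kind of question the knowledge bank is meant to serve.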

Most of the tools we have at our desks in the office are available on the ship during a cruise, thanks to portable computing devices and storage systems (JAZ™ drives and CD-ROMs) that are reliable and inexpensive. What we lack are data standards and interoperable software connections, fine-tuned to accommodate the subtler problems of data manipulation (compression schemes and platform differences) that cause trouble and are not easily resolved.

We are now dealing with multiple platforms, operating systems, and data formats within a single computer network aboard ship! Some data layers, such as navigation and bathymetry, are shared but in different formats and sampling schemes. A master set of original bathymetry and navigation data is archived and used for quality control of the integrated data sets, which for whatever reason may end up with some corrupted data. We typically do not have to, and do not want to, go back to the original data sets because of the extensive reprocessing time.
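One simple way to use the archived master set for quality control is to compare digests: if an integrated copy of a navigation or bathymetry file no longer matches the master's checksum, it was corrupted somewhere in the pipeline. This is an illustrative sketch, not the Center's documented shipboard procedure:

```python
# Checksum-based QC sketch against the archived master data set.
# The sample navigation records are hypothetical.
import hashlib

def digest(data: bytes) -> str:
    """MD5 digest of a data file's contents, used as a corruption check."""
    return hashlib.md5(data).hexdigest()

master   = b"lon,lat,depth\n-70.1,42.3,55\n"
copy_ok  = b"lon,lat,depth\n-70.1,42.3,55\n"
copy_bad = b"lon,lat,depth\n-70.1,42.3,5\n"   # one corrupted depth value

print(digest(copy_ok) == digest(master))   # True: copy matches the master
print(digest(copy_bad) == digest(master))  # False: corruption detected
```

Because only digests are compared, the check is cheap enough to run routinely without touching the original data sets or their reprocessing chain.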

For the sidescan data, if the navigation (with slant range to the vehicle) and the bathymetry are functioning properly, it is possible to construct a sidescan-sonar mosaic draped over the bathymetry as the cruise progresses. This allows real-time decisions about revising survey tracklines to optimize the goals of the cruise and of the research. The imagery and maps are the initial products and the building blocks for publications related to the surveys and research. How they are handled and where they are stored depend on who is processing the data.

The principal data stewards for seismic and sidescan-sonar data are Dave Foster and Bill Danforth, respectively. Each has created metadata templates for the respective data, and each monitors the processing procedures for the software used to translate the raw data into something that can be displayed and eventually published. A sample map created soon after a cruise for publication in CorelDraw™ 7.0 is the New York Bight Open-File Report 97-61 (Schwab et al., Mapping the Sea Floor Geology of the New York-New Jersey Metropolitan Area), which consists of 3 sheets, each with internal graphic files surrounded by text. These maps were constructed successfully in CorelDraw™ 7.0 and printed on a 36-inch-wide HP DesignJet™ 650C with 64 Mbytes of memory. The HP printer was chosen over the 36-inch-wide Versatec™ 8400 electrostatic plotter because its color rendition was more consistent and the gray-scale imagery appeared to have more contrast.

We have had good results with the HP DesignJet™ as a pre-press resource and for poster sessions. We attempted to have OFR 97-61 printed by a commercial firm in Boston; however, they did not have the resources to handle the first file (~20 Mbyte). Their comment was that they had handled similar imagery for medical applications (x-ray images) by printing the images as photos and including them with the printout as photo inserts. This is not a suitable solution for our application, because we wish to use the imagery in digital products with overlays, in some cases as print-on-demand products to whatever devices are available, and in addition we want to archive the data to the web and on CD-ROM.

A mapping template created by B. Seekins using MAPGEN has been in use at the WHFC to set up an outline that specifies cartographic standards for automated maps. It was a PostScript format with many layers, a standard format developed with assistance from the USGS Mapping Applications Center. A visit from two product-generation specialists (Moser, personal communication) helped bring us closer to completing the desktop-to-film connection. A template with font specifications and general layout is on file (Unger and Baker, September '97) and includes a text describing map construction and procedures. We are currently completing the first set of maps using the techniques developed in this collaboration, and as a result the production maps will be produced in a timely fashion.

Some of the things we learned were to pay attention to font types; to use CMYK for color design; and to purchase the latest Adobe Illustrator™, Photoshop™, and PageMaker™ with font management for Windows NT™. With these upgrades we are now prepared to review all graphics produced at the WHFC before transmitting information to Reston or any other site. This saves everyone the grief and misunderstanding that can occur when minor details are overlooked. As our confidence increases we can pursue direct connections for high-priority publications. The new USGS Visual Identity System has arrived; it has been developed to provide a consistent organizational "look and feel." I look forward to helping integrate this philosophy into our products. With the flexibility built into the methods we are using, conforming to the Visual Identity System should be as simple as tuning the color table.




We have been dealing with constant change since we began collecting data. The only aspect that hasn’t changed is our determination to aspire to the highest quality standards and to maintain a professional approach to the way we engineer solutions. The methods proposed here are gathered from the collective experience of a professional team that has a successful track record for proposing, collecting, and publishing some of the most exciting earth science data available for the marine sciences. In all cases, our work begins with an idea or concept that is put on paper as a proposal and then evolves into a funded project plan. 

These proposals and plans are now available at our WHFC intranet site and provide insight and reference for all of our work in progress. As we develop field-work plans we use this reference material to determine the geographic bounds of the proposed research and the amount of time and resources required. By the time we actually start to collect data, much of the FGDC metadata document is already filled in; all that remains is to expand, clarify, and refine the information. Some of the preliminary track lines are already plotted and saved and are used to verify that data are being collected at the proposed sites.

The tools used to display the proposed surveys are the same tools used in the field to verify the collection siting. In some cases we use previously collected field data as a reference for the new surveys. When we complete these surveys and return to the laboratory, we use these same tools to refine the data and build products for publication. The templates and reference data needed to build standard products are available at the WHFC central web site. What is needed is a consensus on what form these products should take and where we should put them as they evolve. The printing of volume products is currently a question of cost and who pays. Ideally, those who need the products should pay, and we should print them as needed.

The computer industry (Dell™, Compaq™, Micron™, Gateway™, and now Apple™) has finally reached the point where a user purchases a system over the web and it is built and delivered after the Internet transaction. Our products really should be no different; in fact, the USGS NMD EROS Data Center does exactly that. Products are requested from a web site, and from that order form a request is processed, whether for custom data from a master data set or a finished product from the archive. Each is assembled and delivered within 7 days of the web request. More WHFC data could be available to collaborators and the general public if they were configured appropriately, documented, and delivered in a similar fashion.

We have reports, scanned documents, and the contents of CD-ROMs reported on the web, but no consistent method of delivering data and digital maps. We have acquired the ArcView™ Internet Map Server (IMS) for Arc/Info™-compatible coverages and, with some expertise, could have an extensive on-line database for the region. ArcView™ works best with customized project files and ArcView™ shapefile coverages. These files are easy to assemble and deliver both on the web and on CD-ROM. We plan to build GIS coverages for each major estuary and embayment. We have started with the Gulf of Mexico; Massachusetts Bay, the Gulf of Maine, Long Island Sound, and the New York/New Jersey Bight are soon to follow as the principal datasets for the IMS.


Useful Web Links

USGS Publications Policy
   Recommendations for Geologic Division Publications
   What the Regional Publications Groups Do
   USGS Publications Series Designations
   Geologic Division Policy Manual
   Implementing the USGS Visual Identity System in the Geologic Division

Metadata resources
   Federal Geographic Data Committee (FGDC)
   National Geospatial Data Clearinghouse 
   META Manager
   Formal metadata Information and Tools

Marine Data Archive Sites
   Coastal and Marine Information Bank
   Geologic Division Data Coordinator
   Goals for Data Management in the Geologic Division
   Marine Realms Information Bank
   NOAA National Geophysical Data Center
   NOAA National Oceanographic Data Center
   Seafloor Mapping Data and Information Server
   USGS Center for Coastal Geology Publications
   USGS Geology Information and Data
   USGS Western Region Coastal Geology Information Bank
   USGS Woods Hole Field Center Data Archives

National Cooperative Geologic Mapping Program

Publications Support Groups
    Eastern Publications Group (Reston)
    Central Publications Group (Denver)
    Western Publications Group (Menlo)

Satellite Data
    CCRS Earth Observation Catalogue (CEOCat)
    USGS Earth Resources Observation Systems (EROS) Data Center
    NASA EOSDIS Information Management System




Much of the data collected at the WHFC is digital and can readily be staged in an appropriate format as standards are agreed upon and implemented. The Internet Map Server is a valuable addition to the WHFC web tools and should encourage multi-disciplinary interaction as it evolves as a Center resource. There are sufficient tools available at the WHFC to publish an extensive array of GIS resources and data. What is needed is a consensus on how to configure the resources and on the optimal form the data should take to be widely used and valuable to our customers and decision makers.




Valuable data are being released over the web from existing WHFC web sites. A focused effort is needed to give the data visual consistency across all platforms and throughout all media. A map and data browser capability should be designed that follows the previously mentioned guidelines and allows all users access to WHFC products, especially the GIS functionality useful for constructing new maps and reports from the documented data.




We appreciate the continued support of the Sea Floor Mapping group. Thanks to the WHFC Computer Group for their continued support of the central server network, RAID array, and printer resources, especially as they relate to the operation of the Internet Map Server.




Danforth and Polloni, Coastal Mapping and Analysis Technology (CMART) preproposal, March 1997.

Foster and Polloni, Methods for Compiling and Publishing Lake Erie Data, April 1996.

Polloni and others, Implementing FGDC Metadata Standards for USGS Marine Operations Field Data Collection, February 1997.

Polloni, personal communication, B.L. Makepeace Inc., Boston, MA, digital imaging and reprographics, May 1997.

Polloni, personal communication, mapping product standards via the NMD Mapping Applications Center, Ed Moser, September 1997.

Seekins, TRU Standards for Map Publications, 1990, map and 6 pages of text.

Soller et al., Digital Mapping Techniques, USGS OFR 97-269, part of the National Geologic Map Database project, June 1997.

Soller et al., Digital Mapping Techniques, USGS OFR 98-xxx, part of the National Geologic Map Database project, May 1998.

Unger and Baker, Stellwagen Bank Mapping Procedures, Summer 1997, map and 18 pages of text.

Adobe, Print Publishing Guide, p/n 0397 0719 (08/95).

Digital Mapping Activities in the Geologic Division, Cross Section, December 1988.

Geologic Division Guidelines for CD Publications, Draft version 6.0, October 1996.



Figures and other tables

Figure 1

