Getting ready to get it right

CAD and/or GI requirements remain one of the more frequently overlooked aspects of project scope in terms of resources, budget and time. Yet experience suggests that the more CAD and GI specialists are consulted, the more likely it is that customer expectations will be met, says James Cutler, eMapSite

In this article, starting from the view that geographic information is critical to the success of a project, we aim to encourage all participants in projects that involve location and use Geographic Information (GI) (as nearly all now do) to play an active part in delivering business value. The issues raised are relevant to GI and CAD but also to wider business factors, and addressing them will result in greater value being attached to the role and contribution of GI expertise.

Playing your part

Regardless of your technical acumen, there are common roles and practices that apply to most projects. From the smallest one-man contractor business to the largest enterprise, projects are undertaken by people with different skill sets and areas of focus, each of whom contributes in a particular way to the design and outcome of the project.

Individuals can be guilty of viewing projects in terms of their own core skill set or interest rather than from the wider perspective of the end user, client, customer or stakeholder. Your end user may at one level believe that you are the expert, while at the other end of the spectrum they may think that they know everything and you are a mere lackey, so it can be a sensitive task to ensure that the deliverables meet their expectations, within budget, on time and to their advantage!

CAD and/or GI requirements remain one of the more frequently overlooked aspects of project scope in terms of resources, budget and time. Yet experience suggests that the more CAD and GI specialists are consulted, the more likely it is that customer expectations will be met.

Early project stages should identify the factors affecting the outcome, deliverables and effectiveness of the project. Such a needs assessment should not be restricted to functional issues; it must also take into account data, personnel and institutional factors. The evaluation, analysis and management of geospatial data necessarily lie at the heart of projects involving geographic information. These elements need to be accommodated or addressed within the design and/or data assimilation stages so that sustainable solutions can be implemented and expectations met.

Requirements and expectations

On the data side in particular, evaluating the following factors in the context of project and institutional objectives is essential:
A. Adequacy (quality, integrity, age/currency, frequency, accuracy, coverage, completeness, reliability, sustainability, consistency, timeliness, scale, resolution, collection, sampling methods etc.)
B. Accessibility (method of distribution, cost, copyright, royalty, format, metadata, acquisition, delivery etc.)
C. Sustainability (maintenance, interoperability, security, resilience, software upgrades, compliance, scalability, user authentication, monitoring, documentation, back-up procedures, retention of staff etc.)

A convenient if not entirely accurate classification of the above would assign responsibility for adequacy to the data producers (including surveyors, quality control teams, and CAD and GI personnel), responsibility for accessibility to project managers (whether or not they encourage the adoption of the processes described here) and responsibility for sustainability to system managers and administrators. In some cases these may be one and the same person(s), expediting the path to effective implementation.

Accessibility and sustainability tend to be more operational or institutional factors, but they are no less important in affecting project or organisational objectives.

Focussing on data adequacy, any evaluation of geospatial data should first consider the purpose for which the data is to be used. Where data is still to be collected, this ensures that it is collected at an appropriate scale and according to a suitable sampling methodology, to the accuracy and precision required. Where data already exists, it should be reviewed through the filter of these same factors. This process is intrinsic to ensuring that adequate time, resources and budget are allocated to the preparation of the digital data baseline and to its enhancement during the project.
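By way of illustration, that fitness-for-purpose review can be reduced to a documented comparison between the stated requirement and the candidate data. The short Python sketch below shows the idea; the field names, thresholds and values are assumptions made for the example, not a published schema.

```python
# Minimal sketch: record the purpose-driven requirement up front, then review
# existing (or newly collected) data against it. Field names, thresholds and
# values are illustrative assumptions, not a published schema.
from dataclasses import dataclass

@dataclass
class Requirement:
    purpose: str
    max_capture_scale: int   # denominator: 2500 means 1:2,500 or better
    max_age_years: int
    coverage: str

@dataclass
class Dataset:
    capture_scale: int
    age_years: int
    coverage: str

def adequate(req: Requirement, ds: Dataset) -> bool:
    """True only if the dataset meets every stated aspect of the requirement."""
    return (ds.capture_scale <= req.max_capture_scale
            and ds.age_years <= req.max_age_years
            and ds.coverage == req.coverage)

req = Requirement("temporary route alignment", 2500, 3, "site plus 500 m buffer")
print(adequate(req, Dataset(capture_scale=10000, age_years=6, coverage="site only")))
# False - flags the need for new capture or a different source
```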

The quality question

It may of course be pertinent to embark on some quality control to test the integrity of the data and associated metadata in relation to its adequacy for the proposed task. This is certainly the case with digital mapping, where modern systems allow seemingly infinite levels of zoom, measurement, analysis, reporting and output. All project members need to be aware of the restrictions that the scale of the source data places on what can legitimately be done with it, and to understand the risks of ignoring them.

Quality assurance (QA) is a set of approaches that, consciously applied and taken together, tend to lead to a satisfactory outcome for a particular process. A QA system based on these guidelines will employ documented procedural rules, templates and closely managed processes into which various checks are built. Errors in data capture (often expressed as RMS error, a statistical measure of positional accuracy) and other metadata can appear arcane to many, but few in the CAD and GI community would fail to recognise their role in achieving valid end results. And it is this requirement that fuels additional primary data capture: more site survey, more detailed sampling, sub-surface survey, terrestrial photogrammetry, 3D modelling and so on.
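For readers who want to see the RMS figure made concrete, the short Python sketch below computes positional RMS error for digitised points checked against independently surveyed control points; the coordinates are invented purely for the example.

```python
# Minimal sketch: positional RMS error of digitised points against independent
# control points. The coordinate pairs below are invented for the example.
import math

def rms_error(captured, control):
    """Root-mean-square positional error between matched (x, y) pairs."""
    if not captured or len(captured) != len(control):
        raise ValueError("point lists must be non-empty and the same length")
    squared = [(xc - xt) ** 2 + (yc - yt) ** 2
               for (xc, yc), (xt, yt) in zip(captured, control)]
    return math.sqrt(sum(squared) / len(squared))

digitised = [(451210.3, 206785.1), (451330.8, 206790.4), (451215.0, 206900.2)]
surveyed  = [(451210.0, 206785.0), (451331.0, 206790.0), (451214.6, 206900.0)]
print(f"Positional RMSE: {rms_error(digitised, surveyed):.2f} m")
```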

Issues to look out for in quality control of geospatial data include:

• Survey precision

• Digitising precision

• Geometric accuracy (RMS error within data)

• Geocoding accuracy and currency

• Accurate recording of source material including date

• Conformity with standards, for example BS7666, W3C Accessibility, OGC WMS

The flip side of this consideration of primary data sources is the suitability of off-the-shelf data for the task at hand (bearing in mind that most carries caveats regarding fitness for specific purposes). For example, OS MasterMap is a seamless geographic database (map) for the whole country, but some areas are surveyed to different tolerances than others, commensurate with perceived user requirements. The emergence of the renewables sector means that some of the country's remotest places, traditionally surveyed at 1:10,000 scale, are now under the microscope for 3D modelling, temporary route alignment, ZVI and more. Users familiar with OS MasterMap for urban planning are well advised to recognise the limitations implicit in using it for the same activities in wilderness areas!
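One way to make that limitation operational is a simple check that flags data being worked at a scale finer than it was captured at. The sketch below is indicative only: the capture scales are the commonly quoted figures for urban, rural and mountain/moorland survey, and the function and its messages are invented for the illustration.

```python
# Indicative sketch: warn when data is used at a scale finer than it was
# captured at. Capture scales per area type are the commonly quoted figures;
# everything else here is an invented illustration.
CAPTURE_SCALE = {"urban": 1250, "rural": 2500, "mountain_moorland": 10000}

def fit_for_use(area_type: str, working_scale: int) -> bool:
    """True if the working scale is no finer than the source capture scale."""
    source = CAPTURE_SCALE[area_type]
    if working_scale < source:   # e.g. plotting at 1:500 from a 1:10,000 survey
        print(f"Warning: 1:{source:,} source data used at 1:{working_scale:,} "
              "implies precision the survey cannot support")
        return False
    return True

fit_for_use("mountain_moorland", 500)   # flagged
fit_for_use("urban", 1250)              # acceptable
```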

Thinking about existing data

Additional issues to be aware of, which also reflect well on business processes:

Duplicate data sets

Do they exist? If so, is the license valid, and is the data of appropriate quality? Bear in mind that outwardly similar geospatial data with identical content (attribution) may have been acquired for different purposes using different scales, accuracies, sampling procedures and licensing terms, which may (or may not) affect its suitability for a third purpose, for example through incompleteness.
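A lightweight screening pass over the holdings catalogue can make those questions routine rather than ad hoc. The sketch below assumes a simple in-house record structure; the field names and thresholds are illustrative, not drawn from any real catalogue.

```python
# Illustrative sketch: screen outwardly similar holdings for license validity,
# scale and currency before re-use. Field names and thresholds are assumptions.
from datetime import date

holdings = [
    {"name": "highways_a", "license_expires": date(2026, 3, 31),
     "capture_scale": 1250, "last_revised": date(2024, 6, 1)},
    {"name": "highways_b", "license_expires": date(2023, 3, 31),
     "capture_scale": 10000, "last_revised": date(2018, 1, 15)},
]

def usable(ds, today=None, max_scale=2500, max_age_years=5):
    """True if in license, detailed enough and recent enough for the new task."""
    today = today or date.today()
    in_license = ds["license_expires"] >= today
    detailed = ds["capture_scale"] <= max_scale
    current = (today - ds["last_revised"]).days <= max_age_years * 365
    return in_license and detailed and current

for ds in holdings:
    print(ds["name"], "usable" if usable(ds) else "needs review")
```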

Sharing

Institutions are rightly proud and protective of their digital data, as it represents a considerable investment of time and resources. Much of the value of such data lies in the information that can be derived from it by users who combine different data sets to extract new information of use in various aspects of planning, management and decision-making. Offering visibility of such data across the enterprise creates an internal market, while scope also exists, for example through use of and compliance with OGC Catalog Services, to open up the data to third parties for commercial gain.
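In practice such a catalogue can be searched programmatically. The sketch below assumes the third-party OWSLib package and an entirely hypothetical endpoint URL; it shows discovery only, not publication.

```python
# Sketch: discovering shared data through an OGC catalogue service (CSW),
# assuming the third-party OWSLib package. The endpoint URL is hypothetical.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://catalogue.example.org/csw")
query = PropertyIsLike("csw:AnyText", "%flood risk%")   # free-text constraint
csw.getrecords2(constraints=[query], maxrecords=10)

for ident, record in csw.records.items():
    print(ident, "-", record.title)
```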

Harmonisation

Since the dawn of GIS (and IS in general), different schools have sought to build strong brands either through proprietary formats and protocols or through open standards. The efforts of the W3C and the OGC mean that most major GIS and business systems now support, directly or through third-party tools, the vast array of geospatial formats in the marketplace. At the very least this obviates the need for software standardisation, and it increasingly contributes to the adoption of web services and browser-based solutions free of licensing costs.
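As a small illustration of that interoperability, the sketch below reads a mix of open and proprietary formats through a single open-source library, assuming the GDAL/OGR Python bindings are installed; the file names are placeholders.

```python
# Sketch: reading mixed open and proprietary formats through one library,
# assuming the GDAL/OGR Python bindings are installed. File names are placeholders.
from osgeo import ogr

for path in ["sites.shp", "boundaries.gml", "assets.gpkg"]:
    ds = ogr.Open(path)
    if ds is None:
        print(f"{path}: no available driver could read this file")
        continue
    layer = ds.GetLayer(0)
    print(f"{path}: {layer.GetFeatureCount()} features read via the "
          f"{ds.GetDriver().GetName()} driver")
```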

Risk, hidden cost and professional integrity

There are many ways to cut corners in the acquisition of geographic information, all of which GI stakeholders can influence at the project design stage:

• Old map data

• Out of license data

• Scanned data (out of date, out of license, distorted, entailing extra man time to correct)

• Small scale data

• 'Screen scraping' (for example of the author's own webpages or those of MultiMap, StreetMap, Google and others)

• Data re-use (for example, in the same or an overlapping area – license and currency issues)

• Data sharing (as above)

• Use of third party material either directly or as a source for deriving your desired information (aerial photography, satellite imagery, scanned maps – all carry licensing and currency risks)

The so-called 'grey market' arising from these activities is believed to be worth some £10-15m per annum in lost royalties to the intellectual property rights (IPR) holders (a small fraction of one per cent of the value of land and property related business in Great Britain each year), a fact not lost on them or their legal teams, as recent court cases brought by Ordnance Survey (vs. the AA) testify.

Self-evidently, digital geographic data are not 'just' lines on a map, appearing out of the ether and lasting in perpetuity through all the changes wrought on the landscape; the IPR holders invest continuously in their products to maintain their value and fitness for purpose. For the user, factors such as professional integrity, risk mitigation, service differentiation and quality management systems increasingly mean that enterprises large and small set great store by transparent, compliant and efficient practices and processes across their operations, including legitimate use of intellectual property.

Conclusion

Firstly, avoid being a nay-sayer and doom-monger! There will always be a few ostriches who want to ignore opportunity cost and insist that 'it' can be done for less. Instead, focus on the customer, their expectations and the business case for an approach that is cognisant and inclusive of the need for, and role of, geographic data. Hopefully, budgets, resources and timeframes will flex in recognition of your insight and value!

This article was written by James Cutler, CEO at eMapSite, a platinum partner of Ordnance Survey and an online mapping service for professional users.

www.emapsite.com
