WCO Data Model: the bridgehead to connectivity in international trade

Satya Prasad Sahu is Commissioner (Single Window), Indian Customs, and Laure Tempier is from the World Customs Organization.

Issues related to redundancies in the documentation and data required by government agencies have been addressed extensively over the last two decades, and several recommendations and guidelines have been developed to reduce the trader’s effort, cost and time in carrying out regulatory formalities. Principal among them is the use of international standards for data and documentation. In this domain, the WCO is promoting the use of its Data Model, a collection of international standards on the data and information required not only by Customs, but also by other government agencies, in relation to the regulation of cross-border trade.

Studies of the time and cost involved in carrying out international trade transactions have underscored the significant impact of ‘document preparation’ and other procedural formalities. It is commonly acknowledged that considerable redundancy exists in the information that a trader is required to provide for commercial, transport and regulatory procedures.

The initiative by Australia in 2005 to develop a standardized data set for international trade illustrates the scale of the problem. With Customs as the lead agency, the Standardized Data Set (SDS) Project was a whole-of-government exercise to establish a common platform for the submission to government of import, export and transit data. The SDS project team collected 7,649 data elements from 41 government agencies. This number was reduced to 3,993 with the elimination of ‘same as’ elements within agencies, and harmonized to 650 after a first quick review. To illustrate, 22 agencies collected the name of the exporter on 118 different forms.
It was used 212 times on those 118 forms, described in 61 different ways, and required in 16 different formats, ranging from 20 to 300 characters in length. The SDS project team standardized this entry to one data element – ‘Exporter Name’, 70 characters in length.

International data standards

When data is interchanged between trade partners by means other than paper documents, e.g. by tele-transmission methods including direct exchange between computer systems, a common ‘language’ should be used with an agreed mode of expressing it, i.e. common protocols, message identification, agreed abbreviations or codes for data representation, message and data element separators, etc. If a universally accepted standard is not used, the ‘language’ has to be agreed bilaterally between each pair of interchange partners. Given the large number of parties exchanging data in an international trade transaction and the ever-increasing number of potential users of tele-transmission techniques, such a bilateral approach is clearly not viable.

Currently, there are at least three ‘data models’ for international trade, that is to say, models which organize data elements and standardize how they relate to one another. The oldest is the United Nations (UN) Trade Data Element Directory (TDED). The TDED has 1,083 elements, whose definitions are available on the web pages of the UN Economic Commission for Europe (UNECE).
The TDED is closely linked to paper-based forms and contains information about the location of each data element on a standard paper layout. It is just a list of data elements and does not explain how to combine them into meaningful information or arrange them on an electronic template. However, the TDED remains relevant even in the electronic environment, as it provides the basis for the UN Electronic Data Interchange for Administration, Commerce and Transport (UN/EDIFACT). UN/EDIFACT comprises a set of internationally agreed standards, directories and guidelines for the electronic interchange of structured data, in particular data related to trade in goods and services, between independent, computerized information systems. The TDED helps construct the UN/EDIFACT components commonly used in Electronic Data Interchange (EDI).

The Core Components Library (CCL), with over 6,000 elements, could be said to be a further development of the TDED. The CCL has a wider scope and, unlike the TDED, incorporates the concepts of data modelling: by following a set of technical rules, one can build electronic business documents. Not all TDED elements are explicitly included among the 6,000 or so elements defined in the CCL. This is probably due to the CCL’s more systematic approach, which can represent several TDED elements through a single, more generalized definition.

The WCO Data Model draws heavily on the above standards. First, it is by and large cross-referenced to the TDED. The WCO Data Model is also expressed through a standard UN/EDIFACT GOVCBR (Government Cross-Border Regulatory) message. The modelling principles of the WCO Data Model are largely similar to those of the CCL, as both are based on the Core Components Technical Specification.
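To give a feel for the UN/EDIFACT syntax mentioned above, the sketch below splits a message fragment into segments and data elements. It assumes the default service characters (apostrophe as segment terminator, ‘+’ as data element separator, ‘:’ as component separator) and, for brevity, ignores the ‘?’ release character; the GOVCBR header values and the party segment content are invented for illustration only.

```python
# Minimal UN/EDIFACT segment splitter -- an illustrative sketch, not a
# full parser.  Assumes the default service characters and ignores the
# "?" release (escape) character.

def parse_message(raw: str):
    """Return a list of (segment_tag, data_elements) tuples.

    Each data element is itself a list of its components.
    """
    segments = [s for s in raw.split("'") if s.strip()]
    parsed = []
    for seg in segments:
        parts = seg.strip().split("+")          # "+" separates data elements
        tag = parts[0]                          # three-letter segment tag
        elements = [p.split(":") for p in parts[1:]]  # ":" separates components
        parsed.append((tag, elements))
    return parsed

# A fragment in the style of a GOVCBR message (all values invented):
raw = "UNH+1+GOVCBR:D:20A:UN'NAD+EX+++ACME EXPORTS LTD'"
for tag, elements in parse_message(raw):
    print(tag, elements)
```

Real implementations would also honour the UNA service string advice, which lets an interchange override these default separators.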
The WCO Data Model has been assessed extensively for compliance with the generic international data standards developed not only by the UN, but also by the International Organization for Standardization (ISO), which form its building blocks. These include the Country Code (ISO 3166), the Currency Code (ISO 4217), Dates, times and periods of time (ISO 8601), and the Trade Data Elements directory (UNTDED – ISO 7372). It may be said that the WCO Data Model is a value-added product built on all these well-recognized data standards. It responded to the need for an international data dictionary for the Customs domain that would both harmonize and simplify Customs data requirements.

The WCO Data Model

The WCO Data Model is a collection of international standards on the data and information required by government agencies in relation to the regulation of cross-border trade. This collection was developed by the WCO after careful examination of all relevant international instruments and guidelines, along with national and industry practice, with the objective of achieving a consensus on the manner in which data will be used in applying regulatory controls in global trade.

The Data Model contains data sets for different border procedures, including definitions of data elements, recommended data formats and suggested code lists. The data elements are logically grouped into units of meaningful information, called “information models”. These information models serve as reusable building blocks with which one can build electronic document and data exchange templates. The Data Model also includes Information Packages, which are standard electronic templates linked to business processes – goods declarations, cargo reports, conveyance reports, licences/permits, and certificates. In short, it is a library of data components and electronic document templates that can be used to exchange business data effectively.
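As a rough illustration of how a data element definition with a recommended format and code list might be represented in software, here is a minimal sketch. The format notation (e.g. ‘an..70’ for alphanumeric, up to 70 characters) follows UN/EDIFACT conventions; the element names, formats and the tiny ISO 3166 code-list subset are illustrative, not taken from the Data Model itself.

```python
# Sketch: representing a data element with a format constraint and an
# optional code list, then validating values against it.  Names and
# formats are illustrative, not actual WCO Data Model definitions.
import re
from dataclasses import dataclass

@dataclass
class DataElement:
    name: str
    fmt: str                          # e.g. "an..70": alphanumeric, max 70 chars
    code_list: frozenset = frozenset()  # empty means "no code list applies"

    def validate(self, value: str) -> bool:
        kind, max_len = re.fullmatch(r"(an|a|n)\.\.(\d+)", self.fmt).groups()
        if len(value) > int(max_len):
            return False                      # exceeds maximum length
        if kind == "n" and not value.isdigit():
            return False                      # numeric-only format violated
        if self.code_list and value not in self.code_list:
            return False                      # not a permitted code value
        return True

exporter_name = DataElement("Exporter name", "an..70")
country = DataElement("Country code", "a..2",
                      frozenset({"AU", "CA", "IN", "JP"}))  # ISO 3166 alpha-2 subset

print(exporter_name.validate("ACME Exports Ltd"))  # True
print(country.validate("AU"))                      # True
print(country.validate("Australia"))               # False: not a code-list value
```

A production implementation would of course draw the formats and code lists from the published Data Model artefacts rather than hard-coding them.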
The fact that the Data Model uses globally recognized standards as its building blocks provides assurance of worldwide acceptance and adherence.

Information Packages

As mentioned earlier, the concept of Information Packages is unique to the WCO Data Model. Information Packages describe the profiles of the different ways in which the Data Model can be used. Starting from very generic templates of information exchange, called Base Information Packages, the WCO has developed Derived Information Packages, which reflect profiles of usage in well-defined legal contexts. Each type of business application of the Data Model may have its own Derived Information Package. For example, for the ship-port interface, there is the International Maritime Organization (IMO) FAL Derived Information Package. For import, export and transit goods declarations, there is the Single Administrative Document (SAD) Information Package. The European Union (EU) has endeavoured to base the specifications for its new Union Customs Code implementing provisions on the WCO Data Model. It is also understood to be considering the production of an EU Customs Information Package that reflects the interface requirements of all its member states.

User profiles

The WCO closely monitors the worldwide adoption and use of the Data Model. Since its launch, WCO members have pursued approaches suited to their unique national situations. A number of members have used the Data Model in national projects to upgrade national IT systems or to develop Single Window solutions. Under a Single Window, government agencies merge their respective goods declarations through a process of harmonization and standardization of data, and the Data Model has been very helpful in harmonizing the documentary requirements of the different agencies involved. For example, Canada has produced an Integrated Import Declaration, which contains the requirements of all participating agencies.
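The harmonization step at the heart of such a merge can be sketched as a simple reconciliation against a synonym table, much like the Australian SDS exercise described at the start of this article: each agency’s field labels are mapped to one standard element, so a single submission satisfies all of them. The agency names and field labels below are invented for illustration.

```python
# Sketch: merging several agencies' overlapping data requirements into
# one harmonized declaration data set, as a Single Window does.
# Agency names and field labels are invented.

# Each agency describes the same facts with different labels.
agency_requirements = {
    "Customs":     ["Exporter Name", "HS Code", "Invoice Value"],
    "Agriculture": ["Name of Exporter", "Commodity Code"],
    "Health":      ["Exporter's Name", "HS Code"],
}

# Synonym table produced by the harmonization exercise: every variant
# label is mapped to a single standardized data element.
synonyms = {
    "Name of Exporter": "Exporter Name",
    "Exporter's Name":  "Exporter Name",
    "Commodity Code":   "HS Code",
}

# Normalize every agency field to its standard name, then deduplicate.
harmonized = sorted({synonyms.get(field, field)
                     for fields in agency_requirements.values()
                     for field in fields})
print(harmonized)   # ['Exporter Name', 'HS Code', 'Invoice Value']
```

Eight agency fields collapse to three standard elements; in the real SDS exercise the same logic took 7,649 collected elements down to 650.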
The implementation of border controls for certain groups of commodities usually involves licences, permits and certificates issued by government agencies. These documents are interchanged by parties within and between countries. The Data Model Information Packages for Licences, Permits, Certificates and Other Authorizations (LPCO) have been developed precisely for this purpose.

The freight forwarding industry operates interfaces with Customs in order to provide crucial advance cargo information at various entry and exit points. Some of these interfaces may require pre-loading, pre-departure or pre-arrival cargo information, which calls for business agility and global data interoperability in the interchange of information. Thus, under a transport contract requiring the provision of advance information to Japan and the EU, for example, a freight forwarder can develop an interface based on the Data Model to provide the required information to both. Relying on data from the export transaction in one country, a Customs broker can rapidly prepare the basic data for import in the country of destination. To manage cross-border information flows efficiently, it is necessary to understand the requirements of the different countries and legal regimes involved, using a global framework based on the Data Model and the associated local Information Packages.

Initiating adoption of the WCO Data Model

Whether it is a government agency or an entity in the private sector associated with international trade and transport, the decision to adopt or use the WCO Data Model involves a critical review of internal functional systems and software applications. Since the exchange of information between organizations is involved, it is also necessary to develop an appropriate project concept and stakeholder engagement model. For Customs administrations, this means consulting national cross-border regulatory agencies in particular.
Each organization must review its existing business architecture and analyse its current information flows. Thereafter, the role to be played by the Data Model should be assessed. If a business model that ensures a favourable cost-benefit situation exists, the organization should prepare a business plan involving the relevant functional, policy and information technology (IT) managers.

The Data Model is available free of charge to all government agencies and organizations, and to all other interested parties at a reasonable cost. Where an organization in the private sector is considering adopting or using the Data Model, it must develop an appropriate business model and contact the WCO in order to obtain the applicable terms and conditions of use. The WCO offers the Data Model to all interested private sector entities on comparable terms of use, on a non-exclusive basis.

One such entity is GEFEG, a company that has worked in the area of metadata development software for many years and has a distribution agreement with the WCO. Using its trademarked product ‘GEFEG.FX’, it currently markets the WCO Data Model in a structured, reusable format, which offers efficient options for the rapid production of WCO Data Model implementation profiles, also referred to as Information Packages. In order to produce conformant e-documents and messages, one must identify, tabulate, map and model the data contained in prescribed forms against the Data Model. Done manually, this could take a considerable amount of time and effort; the GEFEG tool facilitates it through a process of rapid customization. The tool is being used by Customs administrations, individual experts, governmental agencies, traders and other parties involved in cross-border processes.
GEFEG, through its innovative solutions, has promoted the use of the Data Model by showing how it offers significant interoperability and conformance benefits to Customs administrations and partner governmental agencies by providing standardized cross-border data structures.

The future

The future of computing will be defined largely by ubiquitous, highly available computing facilities used by mobile users to access applications on a variety of devices. These applications will depend upon, and result in, real-time information flows in step with progress in the execution of business processes. This is already reflected in trends in online commercial procurement, electronic invoicing, supply chain finance, the management of transport service requests, real-time inventory tracking and warehouse management solutions, fleet and load management, etc.

All these developments point to an environment where supply chain information will be available to be exploited in real time for regulatory purposes. This concept is explored in projects testing ‘data pipelines’ that carry business data across the entire length of the logistical supply chain and can provide real-time, ‘on-demand’ data to Customs and other regulatory agencies. This data is understood to be obtained first-hand from the source, uncontaminated by the quality issues that creep in through repeated transcription and interpretation by participating supply-chain actors. With minimal intervention, data entry and other costs, this approach is a win-win for both industry and government: it leaves fewer activities requiring intermediation in data handling, allowing industry and the Customs brokerage community to deploy their resources on crucial issues of local compliance. This futuristic scenario is already a technological reality, exploited by an increasing number of business applications.
Customs and regulatory agencies, however, are still some distance away from realizing its benefits. The Data Model, with its Information Packages, promises to be just the right resource for capturing and conveying the critical local knowledge needed to build local interfaces onto global information flows – the raw material the ‘plumbers’ of the internet age can use to deliver the seamless flows of information envisaged in the ‘data pipeline’ concept.

More information
dm@wcoomd.org
www.wcodatamodel.org