Best Practices For Tokenization Projects Involving Data In Transit
Last week, I described a list of considerations for any payment organization considering the role of Token Service Provider, or TSP. This week I will review a set of methodologies and practices designed to address the issues and challenges involved in delivering a solution that handles the complexities associated with tokenization.
Organizations that wish to provide this type of service will need to structure the associated work into several phases tailored to the deployment of a tokenization environment. Each phase will need to be comprehensive in cataloguing and addressing all the impacts tokenization can have on a payments organization.
The project should incorporate a vertically integrated approach that includes the ability to catalogue and comprehend the business drivers for the TSP service the company wishes to deploy. The following steps should be part of an initial consultative phase of the project.
Understand the business drivers: Work from the key organizational goals and market drivers to identify and document the business cases for implementing tokenization.
Evaluate the prioritization from both business and technology perspectives: Using the outcome of the baseline business case, map the organizational goals and market drivers to a set of possible use cases and define the associated implications for the business and its IT infrastructure.
Establish a gap analysis: Once the business case is approved and the use cases established, conduct a gap analysis that includes an analysis of the “build versus buy” options. The options should include various scenarios (e.g., hybrid options that include product and customized components of the solution) and their implications for initial and ongoing costs.
A framework should be applied as part of this initial phase of the project to understand, document, evaluate and present recommendations. This framework should feature a set of key criteria that serves as the primary reference document in the remaining phases of the project.
During the requirements phase of a tokenization project, these areas will need to be reviewed and the relevant information gathered to define the overall needs of the organization relative to the solution:
Checklist for requirements analysis (functional and non-functional): The checklist should capture the key points to cover in the requirements analysis, reducing iteration and rework on the detailed requirement specification and integration definition. For instance, the checklist should cover specific needs for:
Member profile management application
- Number of token bins and token ranges to be provisioned
- Token requester registration related changes
- Token service participation related changes
- New message type creation
- New data element creation
- New data element values for existing/new data elements
- Business rules corresponding to domain control
- Business rules corresponding to token assurance level
- Existing authentication services that require the de-tokenized PAN
- Transaction monitoring process
- Transaction logging process
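To make the domain-control bullet above concrete, here is a minimal sketch of the kind of business rule it implies: each token carries a set of domains in which it may be presented. The token values, domain names, and rule table are illustrative assumptions, not any scheme's actual specification.

```python
# Sketch: domain restriction controls for tokens.
# Token numbers and domain codes below are invented for illustration.

TOKEN_DOMAIN_CONTROLS = {
    "4900000123456789": {"ECOM"},           # token restricted to e-commerce
    "5200500987654321": {"POS", "ECOM"},    # token usable at POS and e-commerce
}

def domain_allowed(token: str, presentment_domain: str) -> bool:
    """Return True if the token may be used in the given presentment domain."""
    return presentment_domain in TOKEN_DOMAIN_CONTROLS.get(token, set())

print(domain_allowed("4900000123456789", "ECOM"))  # -> True
print(domain_allowed("4900000123456789", "POS"))   # -> False
```

A production rule engine would also factor in the token assurance level and transaction context, but the lookup pattern is the same: a declined domain check stops the transaction before de-tokenization ever occurs.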
Clearing & Settlement and Dispute Management System
- Token-PAN distinguishability
- New message creation
- New service request creation
Data Services System
- New token data elements
- Token-PAN distinguishability
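The token-PAN distinguishability requirement noted in the checklists above is typically met by carving out dedicated token BIN ranges, so any system can tell a token from a real PAN by prefix alone. The sketch below assumes a hypothetical token BIN table; the ranges shown are invented examples, not real issuer data.

```python
# Sketch: distinguishing tokens from PANs by BIN-range lookup.
# The ranges below are hypothetical token BINs provisioned to the TSP.

TOKEN_BIN_RANGES = [
    (4900000, 4900999),
    (5200500, 5200599),
]

def is_token(value: str) -> bool:
    """Return True if the value falls in a provisioned token BIN range."""
    if not value.isdigit() or len(value) < 13:
        raise ValueError("expected a 13-19 digit numeric string")
    prefix = int(value[:7])  # compare on the leading digits (the BIN)
    return any(low <= prefix <= high for low, high in TOKEN_BIN_RANGES)

print(is_token("4900000123456789"))  # token-range prefix -> True
print(is_token("4111111111111111"))  # ordinary PAN prefix -> False
```

Because authorization, clearing and settlement, and data services systems all need this check, the BIN-range table is usually maintained centrally and distributed to each system rather than duplicated by hand.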
Questionnaire to enable faster identification of specific needs: A comprehensive questionnaire should be the supporting artifact for establishing key requirements, as well as the primary input to the build, integrate and implement phase of the solution. Focusing the questionnaire on the areas below will help maximize the time of the subject matter experts (SMEs) and architects working on the project:
• Token bin provisioning process
• New message and data element creation process
• Any specific database management related guidelines to be followed for Token Vault creation and commissioning
• Mode of communication between Authorization, C&S, DMS systems etc.
• Technical documentation related guidelines
• Message types and data elements to be defined for token generation and de-tokenization requests/responses, whether carried in ISO messages or as XML/API parameters for web-based service calls
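As a concrete illustration of the last questionnaire item, the sketch below shows the shape a token generation request/response pair might take for a web-based service call. Every field name here is an assumption made for illustration, not any scheme's actual message specification.

```python
# Sketch: illustrative token generation request/response payloads.
# All field names and values are hypothetical, not a real specification.
import json

token_request = {
    "messageType": "TOKEN_GENERATE",
    "tokenRequestorId": "TR000012345",     # assigned at requester registration
    "pan": "4111111111111111",
    "tokenType": "DEVICE",                 # e.g. device-bound vs. card-on-file
}

token_response = {
    "messageType": "TOKEN_GENERATE_RESPONSE",
    "token": "4900000123456789",           # drawn from a provisioned token BIN
    "tokenExpiry": "2612",
    "tokenAssuranceLevel": "HIGH",
    "panLastFour": token_request["pan"][-4:],
}

print(json.dumps(token_response, indent=2))
```

Pinning down these message shapes early, before the build phase, is exactly what the questionnaire is meant to accomplish: each field maps back to a checklist item (new data elements, assurance level, requester registration) from the requirements analysis.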
Build, Integrate and Implement Phase
Depending on the outcome of the previous phases, this phase could be a build, a buy-and-integrate, or a hybrid of the two. Whatever the decision, an independent Verification and Validation (V&V) service that provides complete traceability from the business case and use cases through the requirements, design and testing should be incorporated into the project.
Reusable artifacts can accelerate the establishment of a design baseline and speed the start of development and testing, with specific alignment to the organization’s requirements. Many organizations will need to seek external partners with a focused practice in digital payments whose artifacts include:
Baseline data model and interface definitions
- Use cases and data flows for different transaction scenarios
- Interface API with required and optional parameters
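To show what an interface definition with required and optional parameters might look like as a reusable artifact, here is a minimal sketch of a de-tokenization request type. The names and fields are assumptions for illustration only.

```python
# Sketch: a de-tokenization interface with required and optional parameters.
# Field names are illustrative, not any provider's actual API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetokenizeRequest:
    token: str                         # required: the token to resolve
    token_requestor_id: str            # required: caller's registered ID
    domain: Optional[str] = None       # optional: presentment domain to verify
    cryptogram: Optional[str] = None   # optional: token cryptogram, if required

req = DetokenizeRequest(token="4900000123456789",
                        token_requestor_id="TR000012345")
print(req.domain)  # optional parameters default to None
```

Capturing required versus optional parameters in a typed artifact like this lets integration test scenarios be generated directly from the interface definition.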
Integration test scenarios
- Test matrix for functional testing
- Test matrix for integration testing
- Traceability matrix to ensure the use cases are properly reflected in the requirements and the requirements are properly covered by the test cases
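The traceability matrix in the list above can be expressed and checked mechanically. The sketch below links use cases to requirements and requirements to test cases, then flags any requirement that lacks test coverage; all IDs are invented for illustration.

```python
# Sketch: a minimal traceability check from use cases to requirements
# to test cases. All identifiers below are hypothetical examples.

use_case_to_reqs = {
    "UC-01 token provisioning": ["REQ-101", "REQ-102"],
    "UC-02 de-tokenization":    ["REQ-201"],
}
req_to_tests = {
    "REQ-101": ["TC-1001", "TC-1002"],
    "REQ-102": ["TC-1003"],
    "REQ-201": [],                     # no test case yet -> flagged below
}

def uncovered_requirements():
    """Requirements referenced by a use case but lacking any test case."""
    referenced = {r for reqs in use_case_to_reqs.values() for r in reqs}
    return sorted(r for r in referenced if not req_to_tests.get(r))

print(uncovered_requirements())  # -> ['REQ-201']
```

Running a check like this as part of the independent V&V service turns traceability from a one-time document into a gate that catches coverage gaps on every change.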
In addition to considering a partner’s specific expertise in tokenization and best practices associated with it, an organization should consider that company’s specific expertise in the payments industry.
If an organization utilizes outside assistance, as many will, it should look for partners with a proven track record of delivering solutions that can be maintained and expanded easily in an environment that requires continuous availability and high performance.
Next week, I will describe our organization’s approach to assisting payments organizations involved in the creation and management of tokens.