
Informatica PowerCenter Data Validation Option (Version 9.5.0)

Installation and User Guide

Informatica PowerCenter Data Validation Option Version 9.5.0 July 2012 Copyright (c) 1998-2012 Informatica. All rights reserved. This software and documentation contain proprietary information of Informatica Corporation and are provided under a license agreement containing restrictions on use and disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica Corporation. This Software may be protected by U.S. and/or international Patents and other Patents Pending. Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as provided in DFARS 227.7202-1(a) and 227.7702-3(a) (1995), DFARS 252.227-7013(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14 (ALT III), as applicable. The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to us in writing. Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange, PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange Informatica On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging and Informatica Master Data Management are trademarks or registered trademarks of Informatica Corporation in the United States and in jurisdictions throughout the world. All other company and product names may be trade names or trademarks of their respective owners. Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights reserved. Copyright Sun Microsystems. All rights reserved. Copyright RSA Security Inc. All Rights Reserved. Copyright Ordinal Technology Corp. All rights reserved.Copyright Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright Meta Integration Technology, Inc. All rights reserved. Copyright Intalio. All rights reserved. Copyright Oracle. All rights reserved. Copyright Adobe Systems Incorporated. All rights reserved. Copyright DataArt, Inc. All rights reserved. Copyright ComponentSource. All rights reserved. Copyright Microsoft Corporation. All rights reserved. Copyright Rogue Wave Software, Inc. All rights reserved. Copyright Teradata Corporation. All rights reserved. Copyright Yahoo! Inc. All rights reserved. Copyright Glyph & Cog, LLC. All rights reserved. Copyright Thinkmap, Inc. All rights reserved. Copyright Clearpace Software Limited. All rights reserved. Copyright Information Builders, Inc. All rights reserved. Copyright OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo Communications, Inc. All rights reserved. Copyright International Organization for Standardization 1986. All rights reserved. Copyright ej-technologies GmbH . All rights reserved. Copyright Jaspersoft Corporation. All rights reserved. Copyright is International Business Machines Corporation. All rights reserved. 
Copyright yWorks GmbH. All rights reserved. Copyright Lucent Technologies 1997. All rights reserved. Copyright (c) 1986 by University of Toronto. All rights reserved. Copyright 1998-2003 Daniel Veillard. All rights reserved. Copyright 2001-2004 Unicode, Inc. Copyright 1994-1999 IBM Corp. All rights reserved. Copyright MicroQuill Software Publishing, Inc. All rights reserved. Copyright PassMark Software Pty Ltd. All rights reserved. Copyright LogiXML, Inc. All rights reserved. Copyright 2003-2010 Lorenzi Davide, All rights reserved. Copyright Red Hat, Inc. All rights reserved. Copyright The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright EMC Corporation. All rights reserved. This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and other software which is licensed under the Apache License, Version 2.0 (the "License"). You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software copyright 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under the GNU Lesser General Public License Agreement, which may be found at http:// www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose. The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California, Irvine, and Vanderbilt University, Copyright () 1993-2006, all rights reserved. This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html. This product includes Curl software which is Copyright 1996-2007, Daniel Stenberg, <daniel@haxx.se>. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. The product includes software copyright 2001-2005 () MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.dom4j.org/ license.html. The product includes software copyright 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://dojotoolkit.org/license. This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. 
Permissions and limitations regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html. This product includes software copyright 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at http:// www.gnu.org/software/ kawa/Software-License.html. This product includes OSSP UUID software which is Copyright 2002 Ralf S. Engelschall, Copyright 2002 The OSSP Project Copyright 2002 Cable & Wireless Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php. This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software are subject to terms available at http://www.boost.org/LICENSE_1_0.txt. This product includes software copyright 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at http:// www.pcre.org/license.txt. This product includes software copyright 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http:// www.eclipse.org/org/documents/epl-v10.php. This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License, http://www.stlport.org/ doc/ license.html, http://www.asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html, http://httpunit.sourceforge.net/doc/ license.html, http://jung.sourceforge.net/license.txt , http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/license-agreements/fuse-message-broker-v-5-3- licenseagreement; http://antlr.org/license.html ; http://aopalliance.sourceforge.net/ ; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http:// www.jcraft.com/jsch/LICENSE.txt . http://jotm.objectweb.org/bsd_license.html; . http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http:// developer.apple.com/library/mac/#samplecode/HelpHook/Listings/HelpHook_java.html;http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http://

www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http:// www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http:// www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; and http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; and http://xsom.java.net. This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php) the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http:// www.opensource.org/licenses/bsd-license.php) the MIT License (http://www.opensource.org/licenses/mitlicense.php) and the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0). This product includes software copyright 2003-2006 Joe WaInes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab. For further information please visit http://www.extreme.indiana.edu/. This Software is protected by U.S. Patent Numbers 5,794,246; 6,014,670; 6,016,501; 6,029,178; 6,032,158; 6,035,307; 6,044,374; 6,092,086; 6,208,990; 6,339,775; 6,640,226; 6,789,096; 6,820,077; 6,823,373; 6,850,947; 6,895,471; 7,117,215; 7,162,643; 7,243,110; 7,254,590; 7,281,001; 7,421,458; 7,496,588; 7,523,121; 7,584,422; 7,676,516; 7,720,842; 7,721,270; and 7,774,791, international Patents and other Patents Pending. DISCLAIMER: Informatica Corporation provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied warranties of noninfringement, merchantability, or use for a particular purpose. Informatica Corporation does not warrant that this software or documentation is error free. The information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is subject to change at any time without notice. NOTICES This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software Corporation ("DataDirect") which are subject to the following terms and conditions: 1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. 2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS. Part Number: PC-DVO-95000-0001

Table of Contents
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .viii
    Informatica Resources. . . . . viii
    Informatica Customer Portal. . . . . viii
    Informatica Documentation. . . . . viii
    Informatica Web Site. . . . . viii
    Informatica How-To Library. . . . . viii
    Informatica Knowledge Base. . . . . ix
    Informatica Multimedia Knowledge Base. . . . . ix
    Informatica Global Customer Support. . . . . ix

Chapter 1: Data Validation Option Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1


    Data Validation Option Overview. . . . . 1
    Data Validation Option Users. . . . . 1
    Data Validation in an Enterprise. . . . . 2
    Data Validation Workflow. . . . . 2
    Architecture. . . . . 2
    System Requirements. . . . . 3
    Data Validation Methodology. . . . . 3
    Basic Counts, Sums, and Aggregate Tests. . . . . 3
    Check Referential Integrity of Target Tables. . . . . 4
    Enforce Constraints on Target Tables. . . . . 4
    Compare Individual Records between Sources and Targets. . . . . 5
    Data Validation with Views. . . . . 5

Chapter 2: New Features and Behavior Changes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6


    New Features and Enhancements. . . . . 6
    New Features and Enhancements in 9.5.0. . . . . 6
    New Features and Enhancements in 9.1.4.0. . . . . 7
    New Features and Enhancements in 9.1.2.0. . . . . 8
    New Features and Enhancements. . . . . 9
    Behavior Changes. . . . . 10
    Behavior Changes in 9.1.4.0. . . . . 10
    Behavior Changes in 9.1.2.0. . . . . 10
    Behavior Changes. . . . . 11

Chapter 3: Data Validation Option Client Layout. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13


    Data Validation Option Client Overview. . . . . 13
    Tabs Available in Data Validation Option Client Layout. . . . . 14


    Tests Tab. . . . . 15
    SQL Views Tab. . . . . 15
    Lookup Views Tab. . . . . 15
    Join Views Tab. . . . . 16
    Folders. . . . . 16
    Copying Folders. . . . . 17
    Copying Objects. . . . . 17
    Menus. . . . . 18
    Settings Folder. . . . . 19

Chapter 4: Data Validation Option Installation and Configuration. . . . . . . . . . . . . . . . . . . 20


    Data Validation Option Installation and Configuration Overview. . . . . 20
    Prerequisites. . . . . 21
    PowerCenter Support for Data Validation Option Features. . . . . 21
    System Permissions. . . . . 21
    Information Required for Installation. . . . . 22
    Installing and Configuring Data Validation Option for the First Time. . . . . 22
    Data Validation Option Configuration for Additional Users. . . . . 26
    Data Validation Option Upgrade. . . . . 27
    Upgrading from Version 9.1.x to Version 9.5.0. . . . . 27
    Upgrading from Version 3.0. . . . . 27
    Upgrading from Version 3.1. . . . . 28
    DVOCmd Installation on UNIX. . . . . 29
    Environment Variables. . . . . 30
    Modifying the License Key. . . . . 31
    JasperReports Server Setup. . . . . 31

Chapter 5: Data Validation Option Management. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32


    Data Validation Option Management Overview. . . . . 32
    User Configuration. . . . . 32
    Preferences File. . . . . 33
    Changing the User Configuration Directory through a Batch File. . . . . 33
    Changing the User Configuration Directory through an Environment Variable. . . . . 33
    Multiple Data Validation Option Repository Access. . . . . 33
    Multiple PowerCenter Installation Configuration. . . . . 34
    Informatica Authentication. . . . . 34
    Data Validation Option Users and Informatica Users. . . . . 34
    Security. . . . . 35
    Informatica Authentication Parameters. . . . . 35
    Configuring Informatica Authentication. . . . . 35

Chapter 6: Repositories. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
    Repositories Overview. . . . . 36


    Adding a Repository. . . . . 36
    Editing Repositories. . . . . 37
    Deleting Repositories. . . . . 37
    Refreshing Repositories. . . . . 37
    Exporting Repository Metadata. . . . . 38
    Metadata Export and Import. . . . . 38
    Exporting Metadata. . . . . 39
    Importing Metadata. . . . . 39
    Metadata Manager Integration. . . . . 39
    Configuring Metadata Manager Integration. . . . . 40

Chapter 7: Table Pairs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41


    Table Pairs Overview. . . . . 41
    Table Pair Properties. . . . . 41
    Connection Properties. . . . . 42
    Database Processing. . . . . 43
    Pushing Test Logic to the Database. . . . . 44
    WHERE Clauses. . . . . 44
    Table Joins. . . . . 45
    Bad Records Configuration. . . . . 46
    Parameterization. . . . . 48
    Adding Table Pairs. . . . . 48
    Editing Table Pairs. . . . . 49
    Deleting Table Pairs. . . . . 49
    Viewing Overall Test Results. . . . . 49

Chapter 8: Tests for Table Pairs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51


    Tests for Table Pairs Overview. . . . . 51
    Test Properties. . . . . 51
    Tests. . . . . 52
    Fields A and B. . . . . 53
    Conditions A and B. . . . . 54
    Operator. . . . . 54
    Threshold. . . . . 55
    Max Bad Records. . . . . 55
    Case Insensitive. . . . . 55
    Trim Trailing Spaces. . . . . 55
    Null = Null. . . . . 56
    Comments. . . . . 56
    Expression Definitions. . . . . 56
    Expression Tips. . . . . 56
    Adding Tests. . . . . 57
    Editing Tests. . . . . 57


    Deleting Tests. . . . . 57
    Running Tests. . . . . 58
    Automatic Test Generation. . . . . 58
    Generating Table Pairs and Tests. . . . . 58
    Generating Tests for Table Pairs. . . . . 59
    Compare Columns by Position. . . . . 60
    Bad Records. . . . . 61

Chapter 9: Single-Table Constraints. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62


    Single-Table Constraints Overview. . . . . 62
    Single Table Properties. . . . . 62
    Connection Properties. . . . . 63
    Bad Records Configuration. . . . . 65
    Parameterization. . . . . 67
    Adding Single Tables. . . . . 67
    Editing Single Tables. . . . . 68
    Deleting Single Tables. . . . . 68
    Viewing Overall Test Results. . . . . 69

Chapter 10: Tests for Single-Table Constraints. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70


    Tests for Single-Table Constraints Overview. . . . . 70
    Test Properties. . . . . 70
    Tests. . . . . 71
    Field. . . . . 72
    Condition. . . . . 72
    Operator. . . . . 72
    Constraint Value. . . . . 73
    Remaining Controls on Test Editor. . . . . 74
    Adding Tests. . . . . 74
    Editing Tests. . . . . 74
    Deleting Tests. . . . . 75
    Running Tests. . . . . 75
    Bad Records. . . . . 75

Chapter 11: SQL Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76


    SQL Views Overview. . . . . 76
    SQL View Properties. . . . . 76
    Description. . . . . 77
    Table Definitions and Connection. . . . . 77
    Column Definition. . . . . 77
    SQL Statement. . . . . 77
    Comment. . . . . 78
    Adding SQL Views. . . . . 78


    Editing SQL Views. . . . . 78
    Deleting SQL Views. . . . . 78

Chapter 12: Lookup Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79


    Lookup Views Overview. . . . . 79
    Lookup View Properties. . . . . 80
    Selecting Source and Lookup Tables. . . . . 80
    Selecting Connections. . . . . 81
    Overriding Owner Name. . . . . 81
    Source Directory and File. . . . . 81
    Description. . . . . 81
    Source to Lookup Relationship. . . . . 81
    Adding Lookup Views. . . . . 81
    Editing Lookup Views. . . . . 81
    Deleting Lookup Views. . . . . 82
    Lookup Views Example. . . . . 82
    Joining Flat Files or Heterogeneous Tables using a Lookup View. . . . . 83

Chapter 13: Join Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84


    Join Views Overview. . . . . 84
    Join View Data Sources. . . . . 84
    Join View Properties. . . . . 85
    Connection Properties. . . . . 85
    Database Optimization in a Join View. . . . . 87
    Join Types. . . . . 87
    Alias in Join View. . . . . 88
    Join Conditions. . . . . 88
    Adding a Join View. . . . . 88
    Configuring a Table Definition. . . . . 89
    Configuring a Join Condition. . . . . 89
    Managing Join Views. . . . . 90
    Join View Example. . . . . 90

Chapter 14: Reports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93


    Reports Overview. . . . . 93
    Business Intelligence and Reporting Tools (BIRT) Reports. . . . . 93
    BIRT Report Generation. . . . . 94
    SQL and Lookup View Definitions. . . . . 94
    Custom Reports. . . . . 95
    Viewing Reports. . . . . 95
    Jasper Reports. . . . . 95
    Status in Jasper Reports. . . . . 97
    Configuring Jaspersoft Reporting. . . . . 97


    Generating a Report. . . . . 98
    Jasper Report Types. . . . . 99
    Dashboards. . . . . 101
    Metadata Manager Integration. . . . . 102
    Configuring Metadata Manager Integration. . . . . 102

Chapter 15: Command Line Integration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103


    Command Line Integration Overview. . . . . 103
    CopyFolder. . . . . 104
    CreateUserConfig. . . . . 105
    DisableInformaticaAuthentication. . . . . 105
    ExportMetadata. . . . . 106
    ImportMetadata. . . . . 107
    InstallTests. . . . . 107
    Cache Settings. . . . . 108
    LinkDVOUsersToInformatica. . . . . 108
    PurgeRuns. . . . . 109
    RefreshRepository. . . . . 110
    RunTests. . . . . 111
    Cache Settings. . . . . 112
    UpdateInformaticaAuthenticationConfiguration. . . . . 112
    UpgradeRepository. . . . . 113

Chapter 16: Troubleshooting. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114


    Troubleshooting Overview. . . . . 114
    Troubleshooting Initial Errors. . . . . 114
    Troubleshooting Ongoing Errors. . . . . 115
    Troubleshooting Command Line Errors. . . . . 116

Appendix A: Datatype Reference. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117


    Test, Operator, and Datatypes Matrix for Table Pair Tests. . . . . 117
    Test, Operator, and Datatypes Matrix for Single-Table Constraints. . . . . 118

Appendix B: BIRT Report Examples. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119


    Summary of Testing Activities. . . . . 119
    Table Pair Summary. . . . . 120
    Detailed Test Results Test Page. . . . . 122
    Detailed Test Results Bad Records Page. . . . . 123

Appendix C: Jasper Report Examples. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124


    Home Dashboard. . . . . 124
    Repository Dashboard. . . . . 125
    Folder Dashboard. . . . . 126


    Tests Run Vs Tests Passed. . . . . 126
    Total Rows Vs Percentage of Bad Records. . . . . 127
    Most Recent Failed Runs. . . . . 127
    Last Run Summary. . . . . 128

Appendix D: Reporting Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129


    Reporting Views Overview. . . . . 129
    results_summary_view. . . . . 129
    rs_bad_records_view. . . . . 132
    results_id_view. . . . . 133
    meta_sv_view. . . . . 133
    meta_lv_view. . . . . 134
    meta_jv_view. . . . . 135
    meta_ds_view. . . . . 136
    meta_tp_view. . . . . 136
    rs_sv_id_view, rs_lv_id_view, and rs_jv_id_view. . . . . 137

Appendix E: Metadata Import Syntax. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138


    Metadata Import Syntax Overview. . . . . 138
    Table Pair with One Test. . . . . 138
    Table Pair with an SQL View as a Source. . . . . 139
    Table Pair with Two Flat Files. . . . . 139
    Single-Table Constraint. . . . . 140
    SQL View. . . . . 141
    Lookup View. . . . . 141

Appendix F: Glossary. . . . . 142
Index. . . . . 145


Preface
The PowerCenter Data Validation Option Installation and User Guide describes how you can test and validate data across multiple data sources. It is written for database administrators and testers who are responsible for validating enterprise data. This guide assumes you have knowledge of the data sources and PowerCenter.

Informatica Resources
Informatica Customer Portal
As an Informatica customer, you can access the Informatica Customer Portal site at http://mysupport.informatica.com. The site contains product information, user group information, newsletters, access to the Informatica customer support case management system (ATLAS), the Informatica How-To Library, the Informatica Knowledge Base, the Informatica Multimedia Knowledge Base, Informatica Product Documentation, and access to the Informatica user community.

Informatica Documentation
The Informatica Documentation team makes every effort to create accurate, usable documentation. If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation team through email at infa_documentation@informatica.com. We will use your feedback to improve our documentation. Let us know if we can contact you regarding your comments. The Documentation team updates documentation as needed. To get the latest documentation for your product, navigate to Product Documentation from http://mysupport.informatica.com.

Informatica Web Site


You can access the Informatica corporate web site at http://www.informatica.com. The site contains information about Informatica, its background, upcoming events, and sales offices. You will also find product and partner information. The services area of the site includes important information about technical support, training and education, and implementation services.

Informatica How-To Library


As an Informatica customer, you can access the Informatica How-To Library at http://mysupport.informatica.com. The How-To Library is a collection of resources to help you learn more about Informatica products and features. It includes articles and interactive demonstrations that provide solutions to common problems, compare features and behaviors, and guide you through performing specific real-world tasks.


Informatica Knowledge Base


As an Informatica customer, you can access the Informatica Knowledge Base at http://mysupport.informatica.com. Use the Knowledge Base to search for documented solutions to known technical issues about Informatica products. You can also find answers to frequently asked questions, technical white papers, and technical tips. If you have questions, comments, or ideas about the Knowledge Base, contact the Informatica Knowledge Base team through email at KB_Feedback@informatica.com.

Informatica Multimedia Knowledge Base


As an Informatica customer, you can access the Informatica Multimedia Knowledge Base at http://mysupport.informatica.com. The Multimedia Knowledge Base is a collection of instructional multimedia files that help you learn about common concepts and guide you through performing specific tasks. If you have questions, comments, or ideas about the Multimedia Knowledge Base, contact the Informatica Knowledge Base team through email at KB_Feedback@informatica.com.

Informatica Global Customer Support


You can contact a Customer Support Center by telephone or through the Online Support. Online Support requires a user name and password. You can request a user name and password at http://mysupport.informatica.com. Use the following telephone numbers to contact Informatica Global Customer Support:
North America / South America
    Toll Free
        Brazil: 0800 891 0202
        Mexico: 001 888 209 8853
        North America: +1 877 463 2435

Europe / Middle East / Africa
    Toll Free
        France: 0805 804632
        Germany: 0800 5891281
        Italy: 800 915 985
        Netherlands: 0800 2300001
        Portugal: 800 208 360
        Spain: 900 813 166
        Switzerland: 0800 463 200
        United Kingdom: 0800 023 4632
    Standard Rate
        Belgium: +31 30 6022 797
        France: +33 1 4138 9226
        Germany: +49 1805 702 702
        Netherlands: +31 306 022 797
        United Kingdom: +44 1628 511445

Asia / Australia
    Toll Free
        Australia: 1 800 151 830
        New Zealand: 09 9 128 901
    Standard Rate
        India: +91 80 4112 5738


CHAPTER 1

Introduction to Data Validation Option


This chapter includes the following topics:
- Data Validation Option Overview, 1
- Data Validation Workflow, 2
- Architecture, 2
- System Requirements, 3
- Data Validation Methodology, 3

Data Validation Option Overview


Data Validation Option is a solution that you use with PowerCenter to validate data. You can validate target data to verify that it is accurate and that the transformation process did not introduce errors or inconsistencies. Data validation is the process of verifying that moved or transformed data is complete and accurate and has not been changed because of errors in the movement or transformation process. Use PowerCenter Data Validation Option to verify that your data is complete and accurate.

You might have the standard license or the enterprise license for Data Validation Option. If you have the enterprise license, you can use Data Validation Option in a production environment. If you have the standard license, you can use Data Validation Option only in a non-production environment. Data Validation Option with the enterprise license includes the following additional features in 9.5.0:
- Parameterization
- Enhanced bad records storage
- Jasper Reports

Data Validation Option Users


There are many possible users of Data Validation Option:
- Business or Data Analysts
- Data Warehouse Testers
- ETL Developers
- Database Administrators

Data Validation in an Enterprise


There are two types of data validation generally performed in a data integration setting: source-to-target comparisons and production-to-development comparisons. You can perform source-to-target validation at the end of development of a data integration project on the initial load of a data warehouse, or as reconciliation of the ongoing daily or incremental loads. You can perform data validation to compare production and development environments when you upgrade data integration software or RDBMS software. Finally, you can perform data validation as part of the testing process or as part of the production process, often called the reconciliation or Audit/Balance/Control process.

Data Validation Option supports all of the use cases described above. Data Validation Option reads table definitions from PowerCenter metadata repositories and checks the data at either end of the process. It does not check the correctness of transformations or mappings. Data Validation Option identifies problems or inconsistencies but does not attempt to identify the source of the problem in the ETL process.

Data Validation Workflow


A typical workflow for data validation consists of multiple tasks.

1. Data Validation Option reads one or more PowerCenter metadata repositories.
2. Define the validation rules in Data Validation Option.
3. Run the rules to ensure the data conforms to the validation rules. When you do this, Data Validation Option performs the following tasks:
   - Creates and executes all tests through PowerCenter.
   - Loads results into the Data Validation Option results database and displays them in the Data Validation Option Client.
4. Examine the results and identify sources of inconsistencies in the ETL process or the source systems.
5. Repeat this process for new records.

Architecture
Data Validation Option requires installation and setup of PowerCenter. Source and target table and file definitions are imported from PowerCenter repositories. You set up table pairs and test rules in Data Validation Option, and this test metadata is stored in the Data Validation Option repository.

When you run the tests, Data Validation Option communicates with PowerCenter through an API to create the appropriate mappings, sessions, and workflows, and to execute them. PowerCenter, not Data Validation Option, connects to the data being tested. After the tests are executed, results are stored in the Data Validation Option repository and displayed in the Data Validation Option Client.

You can configure Data Validation Option to authenticate users based on Informatica domain login credentials. After you enable Informatica authentication, users must use their Informatica domain login credentials to use the Data Validation Option Client.
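Because results land in the Data Validation Option repository, you can also read them outside the client through the reporting views described in Appendix D. The following query is only a sketch: it assumes a connection to the Data Validation Option repository database, and the specific columns that results_summary_view exposes are documented in Appendix D.

    -- Sketch: read test results directly from the Data Validation Option
    -- repository through the results_summary_view reporting view (Appendix D).
    -- SELECT * is used to avoid assuming particular column names.
    SELECT *
    FROM results_summary_view;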


System Requirements
The PowerCenter Client must be installed on the machine where Data Validation Option is installed. Any system that supports Informatica PowerCenter will support Data Validation Option. However, Data Validation Option works best on a machine that has at least 1GB of RAM.

Data Validation Methodology


A sample methodology to help you design a rigorous data validation process is presented in this section. Most users already have some kind of testing process in place, usually a combination of SQL code and Excel spreadsheets. A common temptation is to replicate the current SQL-based process, so the first question often asked is, "How do I do this with Data Validation Option?"

Use the following guidelines to set up a data validation approach:

1. Test data, not mappings or workflows. Your test framework should not contain parallel mappings, sessions, and workflows. Testing mappings is unit testing, which is different from data validation.
2. Do not try to mimic SQL. Step back and think about what you are trying to accomplish. Data Validation Option can make things a lot easier.
3. Assume the worst. If data needs to be moved from last_name to last_name, it may have been moved to city by mistake. If an IF statement was used, assume it was coded wrong. It is always prudent to assume a mistake has been made and be pleasantly surprised when tests return no errors.
4. Do the easy things first. Complicated problems often manifest themselves in simple ways. Simple counts and constraints can point out some obvious errors.
5. Design the initial test framework without taking performance into account. After you are satisfied with your approach, begin to optimize.
6. Try to split complex SQL into more than one table pair. For example, if you see statements like the following:

       SELECT CASE WHEN code = 'X' THEN TableA.Fld1 ELSE TableB.Fld1 END
       SELECT CASE WHEN code = 'X' THEN TableA.Fld2 ELSE TableB.Fld2 END

   you can create two table pairs:
   - Table A vs. Target, WHERE clause A: code = 'X'
   - Table B vs. Target, WHERE clause A: code <> 'X'
7. Do not copy formulas from the ETL mapping into Data Validation Option. When you need to test a complex transformation, such as a complex IF statement with SUBSTR, you might be tempted to copy it from the mapping. This approach produces an obvious problem: if there is an error in the ETL mapping formula, you will replicate it in Data Validation Option, and Data Validation Option will not catch it. Always maintain a proper separation between ETL and testing.
8. Do not try to do everything in Data Validation Option. If a particular step can be accomplished more easily with SQL, use SQL. If you run 95% of your validation in Data Validation Option and can document it with the audit trail, that is more than enough.

Basic Counts, Sums, and Aggregate Tests


The goal of basic counts, sums, and aggregate tests is to make sure that all records were moved.


This approach detects the following problems:


- Lack of referential integrity in the source. (Child with no parent will not be moved.)
- Row rejection by the target system.
- Incorrect ETL logic in the WHERE clauses.
- Other problems that do not move all the required records.

Approach data validation in the following order:


- COUNT and COUNT_ROWS to count the number of records
- SUM for numeric fields
- COUNT DISTINCT to compare detail vs. aggregate tables
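For example, on a hypothetical ORDERS source and ORDERS_DW target, a first pass might consist of the following tests. The table and column names are illustrative, and the lines describe the tests to create rather than literal Data Validation Option syntax:
Table pair ORDERS vs. ORDERS_DW:
  COUNT_ROWS on any column (all rows arrived)
  COUNT on ORDER_ID (no non-null keys were dropped)
  SUM on SALE_AMOUNT (totals match)
Table pair ORDERS vs. DAILY_SALES_AGG:
  COUNT DISTINCT on ORDER_DATE (one aggregate row per sales day)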

Check Referential Integrity of Target Tables


Check referential integrity of target tables to find lack of referential integrity in the target, either a child without a parent or a fact record without a corresponding dimension record. In a scenario where Table A is the child table, Table B is the parent table, Field A is the child foreign key, and Field B is the parent primary key, approach data validation in the following order:
- The test is SETA_in_B. (Every child FK is in the parent PK.)
- In the case of a star schema, the fact table is the child and the dimension is the parent. (Every fact FK needs to be in the parent PK.)
- In the case of composite keys, create an expression that concatenates the keys, and run these tests on that expression.
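For example, if a hypothetical ORDER_LINES child table references an ORDERS_DIM parent table on the composite key (ORDER_ID, LINE_ID), you might define the same concatenation expression on both tables of the pair and run the SETA_in_B test on that expression. The field names and separator are illustrative:
ORDER_ID || '-' || LINE_ID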

Enforce Constraints on Target Tables


Often the errors in very complicated transformations manifest themselves in rather simple ways such as NULLs in the target, missing rows, or incorrect formats. You can test for such scenarios by enforcing constraints on target tables. This is one of the most overlooked yet most effective data testing strategies. The following examples explain target table constraints:

Unique Primary Keys
UNIQUE(PK)

If a composite PK, use UNIQUE(expression).

Valid Individual Values
For example:


VALUE(FldA) Between 10, 50
VALUE(FldB) In ('A','B','C')
VALUE(FldC) > 0
NOT_NULL(FldD)
FORMAT(Phone) = 'Reg expression' (optional)

You can have more than one test on a specific field and you can create tests on an expression.
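For example, for a hypothetical CUSTOMER_DIM target with a composite natural key and a derived status code, the constraints might look like the following; the table and field names are illustrative:
UNIQUE(SOURCE_SYSTEM || '|' || CUSTOMER_NUM)
VALUE(UPPER(STATUS_CODE)) In ('A','I','P')
NOT_NULL(EFFECTIVE_DATE)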


Aggregate Constraints
These are often used for sanity checks in terms of rows moved, totals, etc. For example, is this correct?

Source (staging file)
1 laptop 1000
2 desktop 500

Target
1 laptop 1000
2 desktop 1500

This looks correct, but if this is XYZ company's daily sales, it is not correct, even though the data was moved correctly. You know that XYZ sells more than $2,500 a day. Therefore, you can say that anything less than 1,000 records, or a total of less than $1 million or more than $2 million, is suspect:
COUNT_ROWS(any fld) > 1000 SUM(Amount) Between 1000000,2000000

Compare Individual Records between Sources and Targets


If a field was moved without transformation or if it was transformed, comparing individual records between sources and targets verifies that the value is correct. This is another critical step in testing. Read the section that explains the difference between VALUE and OUTER_VALUE tests and the expected results when using each test. Approach data validation in the following order:
- Simple comparison. Create a table pair, join on common keys, and then either set up tests automatically (right-click/generate) or manually if field names are different.
- Any row-based expression (concatenation, calculation) can be tested similarly, for example:
VALUE(first_name || '_' || last_name || '@dvosoft.com' = email)

Data Validation with Views


You can use lookup views, SQL views, and join views to create complex data validation scenarios.

SQL Views
After you complete the other data validation methods, you can use Data Validation Option SQL views to construct complicated SQL-based scenarios that involve multiple tables and complicated transformations.

Lookup Views
Testing lookups is an important step in data testing. Data Validation Option lookup views allow you to test the validity of the lookup logic in your transformation layer.

Join Views
You can create complex join relationships between heterogeneous data sources in a join view. You can test the validity of data across related tables with join views.


CHAPTER 2

New Features and Behavior Changes


This chapter includes the following topics:
New Features and Enhancements, 6
Behavior Changes, 10

New Features and Enhancements


This section contains information about new enhancements in different versions of Data Validation Option.

New Features and Enhancements in 9.5.0


In 9.5.0, Data Validation Option contains multiple new features and enhancements.

Authentication
LDAP Authentication Administrators can configure LDAP authentication on the Informatica domain for a Data Validation Option schema with PowerCenter 9.0.1 or later.

Licensing
Data Validation Option is available with a standard license and an enterprise license. You must enter the license key after you install Data Validation Option.

Command Line Utility


InstallTests When you run the InstallTests command you can use the forceinstall option to recreate mappings for the tests that are already installed.

Reports
Jasper Reports If you have the enterprise license, you can generate different reports and dashboards with the JasperReports Server.

Repository
Metadata Manager Integration You can view the details of the PowerCenter repository objects in Metadata Manager.

SQL Views, Lookup Views, and Join Views


Expressions in Join Conditions
You can enter and validate PowerCenter expressions as fields in a join condition when you configure a join view.
Expression Validation
You can validate the PowerCenter expressions that you enter in the Data Validation Option Client when you configure a source to lookup relationship in a lookup view.

Table Pairs and Single Tables


Enhanced Bad Records Storage
If you have the enterprise license, you can store up to 16 million bad records in the Data Validation repository or as a flat file for each test.
Expression Validation
You can validate the PowerCenter expressions that you enter in the Data Validation Option Client when you configure table pairs and single tables.
Parameterization
If you have the enterprise license, you can configure a parameter file and use the parameters in the WHERE clause of a table pair or single table.
Copy Objects
You can copy table pairs and single tables in a folder to another folder.

Tests
Expression Validation
You can validate the PowerCenter expressions that you enter in the Data Validation Option Client when you configure test conditions and expression definitions.
Automatic Test Generation
You can generate tests based on the position of columns of the tables in a table pair.

New Features and Enhancements in 9.1.4.0


In 9.1.4.0, Data Validation Option contains multiple new features and enhancements.

Authentication
Informatica Authentication Administrators can configure Informatica authentication for a Data Validation Option schema with PowerCenter 9.0.1 or later. After you enable Informatica authentication, users must use Informatica domain credentials to log in to the Data Validation Option Client.


Repositories
Data Sources You can use the following data sources in Data Validation Option:
- PowerExchange for DB2 z/OS
- Netezza
- SAP
- Salesforce.com
- SAS

Single Tables and Table Pairs


Threshold
You can enter the threshold for aggregate and value tests in table pairs and single tables as a percentage value in addition to the absolute value.
Max Bad Records
You can enter a percentage value for the maximum number of bad records for tests in table pairs and single tables in addition to the absolute value.
Number of Processed Records
After you run a test, the test report displays the details of the processed records. You can view the number of bad records, total number of records processed through the join, and the number of records read from the database.
Join Views
You can create an object with complex join conditions in multiple heterogeneous data sources. You can use a join view as a table in table pairs and single tables.

New Features and Enhancements in 9.1.2.0


In 9.1.2.0, Data Validation Option contains multiple new features and enhancements.

Command Line Utility


RunTests Command
When you run tests with the RunTests command, you can use the following options:
- Send an email with the test results once Data Validation Option completes the test.
- Provide cache memory setting for mapping transformations.

InstallTests Command
When you install tests with the InstallTests command, you can provide the cache memory setting for mapping transformations.

Tests
Automatic Test Generation You can generate count tests along with value tests in automatic test generation.


Repositories
Data Sources
Data Validation Option supports the following data sources through PowerExchange for ODBC:
- DB2 z/OS
- DB2 AS/400
- IMS
- Adabas
- VSAM
- Mainframe flat files

Metadata Import
Performance improves significantly when you import or refresh metadata from a PowerCenter repository.

New Features and Enhancements in 9.1.0


Data Validation Option version 9.1.0 contains new features and enhancements.

Client Layout
- Folders. You can organize single tables and table pairs by placing them in folders. When you upgrade from version 3.1, the installation program creates a Default folder for each user and places the table pairs and single tables in the folder. When you create a new user, Data Validation Option creates a Default folder.
- Error reporting. If a test fails to run, Data Validation Option displays the test run error on the Tests tab. Previously, you had to examine the PowerCenter session log file to view test run errors.
- Single Tables tab. The details area contains separate tabs for table pairs and single tables.

Command Line Utility
- New Commands. The Data Validation Option command line utility, DVOCmd.exe, contains new commands that allow you to create users and refresh repositories.

PowerCenter Version
- PowerCenter 9.1.0. Data Validation Option 9.1.0 works with PowerCenter versions 8.5 and later, except for PowerCenter version 9.0.

Reports
- Reports for tests in folders. You can run reports for all tests in a folder.
- Report information. Reports display the folder name, error messages, join expressions, and conditions, if applicable.

Repositories
- Refreshing repositories. When you refresh a repository, you can refresh the entire repository, the connection objects, the folders, or the sources and targets. You can also refresh repository folders individually.

Single Tables and Table Pairs
- Expressions in join conditions. When you join two tables in a table pair, you can enter a PowerCenter expression as a field in the join condition. Enter an expression to join tables with key fields that are not identical.
- Large table processing. When you include a large table in a table pair, you can optimize the way Data Validation Option joins table data. You can specify which table Data Validation Option uses for the master or detail table. You can also use sorted output for the join.
- Pushing sorting logic to the database. To increase the performance of table pair and single table tests, you can push the sorting logic for joins to the source database. Pushing sorting logic to the database causes the database to sort records before it loads them to PowerCenter, which minimizes disk input and output.

Tests
- Filter conditions. You can apply a filter condition to table pair and single table tests. If you apply a filter condition to a table pair test, Data Validation Option applies the filter condition after it joins the tables in the table pair.

Behavior Changes
This section contains information on behavior changes in different versions of Data Validation Option.

Behavior Changes in 9.1.4.0


Effective in 9.1.4.0, Data Validation Option behavior changes in multiple ways.

Client Layout
SQL Views Tab and Lookup Views Tab You can no longer add SQL views and lookup views to a table pair or single table from the right-click menu in SQL Views Tab and Lookup Views Tab.

Repositories
PowerCenter Repository Support You can use repositories from PowerCenter 8.6.1 HotFix 10 and later. Previously, you could use repositories from PowerCenter 8.5 and later.

Tests
Automatic Test Generation You select whether to enable trim trailing spaces for the tests that you generate automatically. Previously, you had to manually update the automatically generated tests to enable trim trailing spaces. You can also apply separate data source sorting for the tables in the table pair. Previously, you could not provide separate sorting for the tables.

Behavior Changes in 9.1.2.0


Effective in 9.1.2.0, Data Validation Option behavior changes in multiple ways.

Single Tables and Table Pairs


Database connection
The name of the database connection that you provide in Data Validation Option table pairs and single tables is no longer case sensitive. Previously, if you edited the database connection name to a different case in PowerCenter, the existing table pairs and single tables were invalidated.


Tests
Autogeneration of Tests
Use the Compare Tables menu item to autogenerate table pairs and tests between tables and flat files in folders. The menu item contains several options. Previously, you had to select the two folders and right-click to autogenerate table pairs and tests.

Behavior Changes in 9.1.0


Effective in 9.1.0, Data Validation Option behavior changes in multiple ways.

Client Layout
Data Sources tab
The Data Sources tab is removed.
Folders
Table pairs and single tables appear in the Default folder in the Navigator. Previously, single tables and table pairs appeared in the Single Tables and Table Pairs nodes in the Navigator.
Properties area
The Properties area is moved to the bottom right side of the Data Validation Option Client to make more room for the Navigator. Previously, the Properties area appeared in the bottom left side of the screen.
Results tab
The tab that lists bad records for tests is renamed to Results. The Results tab displays test summary information for table pairs, single tables, and tests. It also displays bad records for certain types of tests. Previously, Data Validation Option displayed the Details tab only for tests. It displayed bad records only.
Single Tables tab
In the details area, single tables are listed on the Single Tables tab. Previously, single tables were listed on the Table Pairs tab.

Installation
Executable name
The Data Validation Option executable file name is DVOClient.exe. Previously, the executable file name was DataValidator.exe.
Installation directory
The Data Validation Option default installation directory is C:\Program Files\Informatica<version>\DVO on 32-bit operating systems and C:\Program Files (x86)\Informatica<version>\DVO on 64-bit operating systems. Previously, the default installation directory was C:\Program Files\DVOSoft.

Reports
Reports for single-table constraints Reports for single-table constraints display information about the single table only. Previously, reports for single-table constraints were the same as reports for table pairs except they displayed Table B values as null values.


Repositories
Importing metadata
When you add a PowerCenter repository, Data Validation Option imports folder names. If the repository is a target repository, Data Validation Option also imports connection metadata. To import source and target metadata, you must refresh the repository. Previously, when you saved a repository for the first time, Data Validation Option imported folder names and all source and target metadata. If the repository was a target repository, Data Validation Option also imported connection metadata.
Refreshing repositories
When you refresh a repository, you do not have to close and restart Data Validation Option. Previously, you had to restart Data Validation Option after you refreshed a repository.

Single Tables and Table Pairs


Single Tables Editor You create single tables using the Single Table Editor dialog box. Previously, you created single tables using the Table Pairs Editor dialog box.

SQL and Lookup Views


SQL view definition When you create an SQL view, you can select the tables and columns to use in the view. Data Validation Option detects the datatype, precision, and scale of the columns. Previously, when you created an SQL view, you had to define the SQL relationships across tables, define the columns, and define the datatype, precision, and scale of each column manually.


CHAPTER 3

Data Validation Option Client Layout


This chapter includes the following topics:
Data Validation Option Client Overview, 13
Data Validation Option Client Tabs, 14
Folders, 16
Menus, 18

Data Validation Option Client Overview


The Data Validation Option Client contains multiple areas and menus that allow you to perform different tasks.

Statistics Area
The statistics area appears when you click on the Data Validation Option user. This area displays information about the number of repositories, table pairs, single tables, tests, views, and data sources that exist in the current instance of Data Validation Option. It also displays information about running tests and the user name.

Navigator
The Navigator is on the left side of the Data Validation Option Client. It contains the following objects:
- INFA Repositories: Lists all PowerCenter repositories that you add to Data Validation Option. Expand a repository to see the repository folders and the sources and targets in each folder.
- SQL views: Lists the SQL views that you create.
- Lookup views: Lists the lookup views that you create.
- Join views: Lists the join views that you create.
- Folders: Lists the single tables and table pairs that you create.
- Table Pairs: Lists all the table pairs that you create.
- Single Tables: Lists all the single tables that you create.

Details Area
The details area is in the upper right section of the Data Validation Option Client. It contains tabs that display details about the objects you create in Data Validation Option such as tests, table pairs, single tables, and views. When you click on a folder, table pair, or single table, the details area displays the following information about the tests associated with the object:
- Number of tests.
- Number of tests passed.
- Number of tests failed.
- Number of tests in progress.
- Number of tests not run because of errors.
- Number of tests not run by the user.

Properties Area
The Properties area is in the lower right section of the Data Validation Option Client. It displays the properties for the object that you select in the Navigator or details area.

Results Area
The Results area appears in the lower right section of the Data Validation Option Client when you select a test, table pair, or single table in the details area. The Results area displays test summary information, results, and the bad records written to the Data Validation Option repository.

Status Bar
The status bar appears below the Results area and displays the number of tests in progress and the number of tests in queue to be run.

Data Validation Option Client Tabs


The Data Validation Option Client contains tabs that display different information. The Data Validation Option Client contains the following tabs:
- Tests
- Table Pairs
- Single Tables
- SQL Views
- Lookup Views
- Join Views


Tests Tab
The Tests tab in the details area displays all tests set up in this instance of Data Validation Option. By default, tests are sorted in the order they were created. However, you can sort tests by clicking the column header. The following list describes the columns on the Tests tab:
- Test status icon: Indicates whether tests associated with the table pair have been run and the status of the most recent run. If you hold the pointer over the icon, Data Validation Option displays the meaning of the icon.
- Name: The test description.
- Test type: Type of test.
- Table Pair/Single Table: The name of the table pair or single table.
- Test Run Date/Time: The date and time that the tests were last run.
- Test Run Error: If a test failed, this column lists the error.

SQL Views Tab


The SQL Views tab in the details area displays all SQL views set up in this instance of Data Validation Option. By default, SQL views are sorted in the order they were created. However, you can sort SQL views by clicking the column header. The following list describes the columns on the SQL Views tab:
- Description: SQL view description.
- Table Name: Tables you use to create the SQL view.
- SQL Statement: SQL statement that you run against the database to retrieve data for the SQL view.

The right-click menu in the SQL Views tab lists the following options:
- Add SQL View
- Edit SQL View
- Delete SQL View
- Export Metadata

Lookup Views Tab


The Lookup Views tab in the details area displays all lookup views set up in this instance of Data Validation Option. By default, lookup views are sorted in the order they were created. However, you can sort lookup views by clicking the column header.


The following list describes the columns on the Lookup Views tab:
- Description: Lookup view description.
- Source Table: Source table name.
- Lookup Table: Lookup table name.

The right-click menu in the Lookup Views tab lists the following options:
- Add Lookup View
- Edit Lookup View
- Delete Lookup View
- Export Metadata

Join Views Tab


The Join Views tab in the details area displays all join views set up in this instance of Data Validation Option. By default, join views are sorted in the order they were created. However, you can sort join views by clicking the column header. The following list describes the columns on the Join Views tab:
- Description: Join view description.
- Joined Tables: List of tables joined in the join view.

The right-click menu in the Join Views tab lists the following options:
- Add Join View
- Edit Join View
- Delete Join View
- Export Metadata

Folders
Folders store the single tables and table pairs that you create. By default, Data Validation Option places the single tables and table pairs that you create in the default folder. If you create a folder, you can create single tables or table pairs within the folder. You can move single tables or table pairs between folders. Within a folder, you can expand a single table or table pair to view the tests associated with it. Folder names are case sensitive.


You can also copy folders. You can copy the contents of a folder in your workspace to a different folder in your workspace or to a folder in another user workspace. You must copy folder contents to a new folder. You cannot copy folder contents to a folder that exists in the target workspace. When you copy a folder, Data Validation Option copies all table pairs, single tables, and test cases in the source folder to the target folder. Data Validation Option does not copy test runs or the external IDs associated with table pairs or single tables. If a table pair or single table in the source folder uses an SQL view or a lookup view, Data Validation Option copies the view to the target user workspace unless the workspace contains a view with the same name. If the target workspace contains a view with the same name, Data Validation Option gives you the following options:
- You can use the view in the target workspace.
- You can copy the view to the target workspace with another name. Data Validation Option names the view in the target workspace "Copy <number> <source view name>."

Before Data Validation Option copies a folder, it verifies that the repository and all data sources associated with the objects to copy exist in the target workspace. Object names in Data Validation Option are case sensitive. Therefore, the repository, data sources, and folders that contain the data sources must have identical names in the source and the target workspaces. If the repository or any required data source does not exist in the target workspace, Data Validation Option does not copy the folder.

Copying Folders
You can copy the contents of a folder in your workspace to a different folder in your workspace or to a folder in another user workspace.
1. Select Edit > Copy Folder. The Copy Folder Contents dialog box opens.
2. Enter the following information:
- Copy from User: Name of the source user. Data Validation Option copies the folder in this user workspace.
- Copy from Folder: Name of the source folder. Data Validation Option copies this folder.
- Copy to User: Name of the target user. Data Validation Option copies the folder to this user workspace. The source user and the target user can be the same user.
- Copy to Folder: Name of the target folder. The target folder must be unique in the target workspace.
3. Click OK.

Copying Objects
You can copy the objects in a folder in your workspace to a different folder in your workspace or to a folder in another user workspace.
1. Select the object that you want to copy. You can also select multiple objects of the same type from different folders.
2. Right-click the object and select Copy. The Copy Object(s) dialog box appears.
3. Enter the following information:
- User: Name of the target user. Data Validation Option copies the objects to this user workspace. The source user and the target user can be the same user.
- Folder: Name of the target folder. The target folder must be unique in the target workspace.

Menus
The following list describes the Data Validation Option menu items:

File menu:
- New: Create a new Data Validation Option object.
- Metadata: Import, export, and reload metadata.
- Settings: Configure preferences and open the Data Validation Option folder in the Documents and Settings directory in Windows Explorer.
- Exit: Exit the Data Validation Option Client.

Edit menu:
- Edit: Edit the selected object.
- Delete: Delete the selected object.
- Copy folder: Copy the selected folder.
- Move tables/table pair: Move tables/table pairs to the selected folder.

Action menu:
- Run Tests: Run selected tests.
- Add Test: Add a new test.
- Generate Value Tests: Generate value tests for the selected object.
- Compare Tables: Autogenerate tests for all the table pairs.
- Generate Report: Generate a consolidated test report.
- Dashboard: Launches the dashboard. The feature is available only for enterprise customers.
- Refresh All Repositories: Refresh the contents of all the repositories.
- Everything: Refresh all contents in the selected repository.
- Folder List: Refresh the list of folders in the selected repository.
- Folder (Sources and Targets): Refresh the sources and targets in the selected repository.
- Connections: Refresh all connections in the selected repository.

Dashboards menu:
- Home: Launches the Home dashboard.
- Repository Details: Launches the Repository dashboard.
- Folder Details: Launches the Home dashboard.
- Table Details: Launches the Home dashboard.

Help menu:
- Help: Opens the help file.
- About: Displays information about PowerCenter Data Validation Option. Click the dialog box to close it.
- Change License Key: Opens the Change License Key dialog box.

Note: You can see the Dashboards menu and menu items if you have the enterprise license.

Settings Folder
When you select File > Settings > Open Settings Folder, Windows Explorer displays the contents of the Data Validation Option folder in the Documents and Settings directory for that installation of the application. The data folder also contains an XML file that contains the information entered in the Preferences dialog box.


CHAPTER 4

Data Validation Option Installation and Configuration


This chapter includes the following topics:
Data Validation Option Installation and Configuration Overview, 20
Prerequisites, 21
System Permissions, 21
Information Required for Installation, 22
Installing and Configuring Data Validation Option for the First Time, 22
Data Validation Option Configuration for Additional Users, 26
Data Validation Option Upgrade, 27
Upgrading from Version 9.1.x to Version 9.5.0, 27
Upgrading from Version 3.0, 27
Upgrading from Version 3.1, 28
DVOCmd Installation on UNIX, 29
Environment Variables, 30
Modifying the License Key, 31
JasperReports Server Setup, 31

Data Validation Option Installation and Configuration Overview


You must install the Data Validation Option Client before you can create and run data validation tests. To install or upgrade Data Validation Option, complete the following tasks:
1. Review the prerequisites.
2. Review the required system permissions.
3. Install or upgrade Data Validation Option.
4. Perform the Data Validation Option setup steps.
After you install Data Validation Option, run a test to verify that the installation was successful.


Prerequisites
You must complete the prerequisites to successfully install Data Validation Option. Before you install Data Validation Option, complete the following prerequisites:
- Install PowerCenter 8.6.1 HotFix 10 or later on the same local area network as the Data Validation Option Client machine.
- The Informatica domain must contain at least one PowerCenter Integration Service.
- Install the PowerCenter Client on the same machine where Data Validation Option will be installed.
- Set up at least one PowerCenter repository.
- Obtain the license file to use Data Validation Option. Data Validation Option has a separate license file. You cannot use the license file for PowerCenter on Data Validation Option.

PowerCenter Support for Data Validation Option Features


You must install PowerCenter 8.6.1 HotFix 10 or later to use Data Validation Option 9.5.0. Certain features in Data Validation Option require a later version of PowerCenter. The following list shows the Data Validation Option features and the supported PowerCenter versions:
- SAP R/3 data source support: PowerCenter 9.1.0 and later
- SAS data source support: PowerCenter 9.1.0 and later
- Informatica Authentication: PowerCenter 9.0.1 and later

DVOCmd in Data Validation Option 9.1.0 and later requires PowerCenter 8.6.1 HotFix 10 or later. Prior versions of DVOCmd work with previous versions of PowerCenter 8.6.1.

System Permissions
You require certain system permissions to complete Data Validation Option installation and configuration. To complete Data Validation Option setup, verify that you have the permission to complete the following tasks:
- Create a database, including the ability to create schemas, tables, indexes, sequences, and views.
- Create a PowerCenter connection object in the Workflow Manager.
- Create a folder in a PowerCenter repository in the Repository Manager.
- Create and associate a PowerCenter Integration Service with the PowerCenter repository that the Data Validation Option user can access.
- Copy a JAR file onto the machine that hosts Informatica Services.
- Configure the Administrator tool.
- Modify the environment variables on the machine where you install Data Validation Option.
- Read and write on the Data Validation Option installation directory and subdirectories.


Information Required for Installation


Before you install Data Validation Option, gather the information that you need during installation. Record the following values before you begin Data Validation Option setup:
- Informatica Domain
- PowerCenter Integration Service
- PowerCenter Repository Service
- PowerCenter Repository user name
- PowerCenter Repository password
- Location of the domains.infa file on the client machine

Installing and Configuring Data Validation Option for the First Time
When you install and configure the Data Validation Option Client for the first time, you must configure the PowerCenter repository that holds the mappings and sessions for the data validation tests.
1. Verify that the user for the Data Validation Option repository has the privileges to create and modify tables, indexes, sequences, and views during installation. The user must have these privileges to create a Data Validation Option repository.
Note: If the Data Validation Option repository database is IBM DB2, the user name and schema name must be the same. Configure the page size in IBM DB2 to a minimum of 16KB. You cannot install the Data Validation Option repository on a clustered IBM DB2 system.
2. Open the Administrator tool and create a PowerCenter Repository Service to store the Data Validation Option mappings. You can also use an existing PowerCenter Repository Service.
3. Verify that the code page set for the PowerCenter Integration Service is compatible with the NLS setting of the Data Validation Option repository database. If the settings are not compatible, test results might be inaccurate.
4. Open the Workflow Manager and set up a connection to the Data Validation Option repository. Every Data Validation Option user must have the permission to use this connection. Record the connection name: ______________________
5. Open the Repository Manager and create a folder in the repository for Data Validation Option to store mappings that run tests. Use this folder only for storing Data Validation Option mappings. Every Data Validation Option user must have the privileges to use this folder. Record the repository name: ______________________ and the folder name: ______________________
6. Verify that the domains.infa file is available in the following location: <PowerCenter installation directory>\clients\PowerCenterClient\. The domains.infa file contains the Informatica domain and PowerCenter repository details. You must connect to the Informatica domain and the PowerCenter repository from the PowerCenter client tools to update the domains.infa file with the domain and repository details. If the domains.infa file is not available on the PowerCenter Client machine, copy the file from the following location on the PowerCenter server machine: <Informatica Services installation directory>\<version>
7. On the Data Validation Option Client machine, create an environment variable called INFA_HOME and set the value to the location of the domains.infa file:
a. Select Control Panel > System > Advanced > Environment Variables.
b. Click New System Variable.
c. Enter INFA_HOME for the variable name.
d. Enter the domains.infa file path, excluding the domains.infa filename, for the variable value.
e. Click OK in each dialog box.
8. Verify that the environment variable is set up correctly, as shown in the example that follows this step:
a. Open the DOS command window and type set. The environment variable that was just set up should appear in the list of environment variables. It should read as follows: INFA_HOME = C:\Informatica\<version>\clients\PowerCenterClient\
b. Configure the variable if the environment variable is not set.

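A quick way to confirm the value from a new command prompt is to echo the variable. The path shown in the comment is only an example for a hypothetical 9.5.0 client installation:
echo %INFA_HOME%
REM expected output, for example: C:\Informatica\9.5.0\clients\PowerCenterClient\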
9. Install Data Validation Option on the client machine.
10. Create a folder on the machine that hosts Informatica Services and copy the dvoct.jar file from the C:\Program Files< (x86)>\Informatica<version>\DVO\powercenterlibs directory on the Data Validation Option Client to the new folder. Ensure that the PowerCenter Integration Service can access the location.
11. Update the Java SDK Classpath for the PowerCenter Integration Service:
a. Open the Administrator tool.
b. From the navigator, select the PowerCenter Integration Service.
c. Click the Processes tab.
d. Edit the Service Process Properties > General Properties.
e. Edit the Java SDK Classpath.
f. Enter the path to the dvoct.jar file on the machine that hosts the Informatica Services, including the dvoct.jar file name. If there is a value in Java SDK Classpath, add a semi-colon (Windows) or colon (UNIX/Linux) after the classpath before you add the dvoct.jar file path.
If PowerCenter is installed in a grid environment, repeat this step for each node.
12. Run the Data Validation Option Client.


13. Enter the Data Validation Option repository information:
a. In Data Validation Option, select File > Settings > Preferences > Data Validation Option. You can also right-click the Data Validation Option user in the Navigator to open the Preferences dialog box.
b. Enter the following information:
- User: Enter a unique user name.
- Database Type: Select Oracle, SQL Server, or IBM DB2.
- Database Driver: This value is automatically populated by Data Validation Option. It does not need to be changed.
- Database URL: The value automatically populated by Data Validation Option for this field consists of a series of values. There are placeholders for the database host (server) name, database name, and port number, if appropriate. Remove the characters '<' and '>' when you enter the value. A completed example follows this step. Note: If Oracle RAC is used, the URL must be in the following format:
jdbc:oracle:thin:@(DESCRIPTION=(LOAD_BALANCE=on)(ADDRESS=(PROTOCOL=TCP)(HOST=host1)(PORT=1521))(ADDRESS=(PROTOCOL=TCP)(HOST=host2)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=service)))
See tnsnames.ora for the exact syntax.
- Database User: Enter the database user name.
- Database Password: Enter the database user password.
c. Click Test to make sure the database information is correct.
d. Click Save, and create the Data Validation Option repository schema when prompted.
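For a single-instance (non-RAC) Oracle repository, the completed Database URL typically follows the standard Oracle thin-driver format. The host, port, and service name or SID below are placeholders for your environment:
jdbc:oracle:thin:@//dbhost.example.com:1521/DVO_SVC
jdbc:oracle:thin:@dbhost.example.com:1521:DVOSID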

14. Optionally, update the mapping properties based on the requirements and the environment:
a. Select File > Settings > Preferences > Mapping Properties.
b. Enter the following information:
- Max Bad Records for Reporting: Maximum number of bad records for reporting written to the Data Validation Option repository for each test. The maximum value you can enter is 1000. Default is 100.
- DTM Buffer Size: Amount of memory allocated to the PowerCenter session from the DTM process. Default is Automatic. Data Validation Option uses the buffer size that you configure in PowerCenter if you do not enter a value. You can increase the DTM buffer size if the tests contain a large number of table pairs. You can specify a numeric value. If you enter 2000, the PowerCenter Integration Service interprets the number as 2000 bytes. Append KB, MB, or GB to the value to specify other units. For example, you can specify 512MB.
- Max Concurrent Runs: Maximum number of PowerCenter sessions run at the same time. Each table pair is run as one session, regardless of how many tests it has. Default is 10. The maximum value you can enter is 50.
c. If you have the enterprise license, you can configure the following detailed error rows analysis settings:
- Max Bad Records for Detailed Analysis: Maximum number of bad records stored for detailed error record analysis. Data Validation Option stores up to 16,000,000 records for error record analysis. Default is 5000.
- File delimiter: Delimiter character to separate the error records if you choose to store the bad records in a file.
15. Restart the Data Validation Option Client. The Data Validation Option Client prompts you to enter the license key.
16. Click Browse and select the license key. The Data Validation Option Client prompts you to restart the application. Data Validation Option stores the license information in the Data Validation Option database schema.
17. Add the repository that contains the folder where Data Validation Option creates mappings:
a. Right-click INFA Repositories in the Navigator, and select Add Repository. The Repository Editor dialog box opens with the following options:
- Name: Name for the repository, ideally with the word target to identify the repository type.
- Client Location: Location of the pmrep.exe file on the client machine, selected with the Browse button. Typically the location is C:\Informatica\<version>\clients\PowerCenterClient\client\bin.
- PowerCenter Version: The PowerCenter version that runs Data Validation Option.
- PowerCenter Domain: Name of the Informatica domain.
- Repository: Name of the PowerCenter repository.
- User name: User name for the PowerCenter repository.
- Password: User password for the PowerCenter repository.
- Security Domain: LDAP security domain. Leave this field blank if you use native authentication.
- Contains Target Folder: Select true. You can configure only one PowerCenter repository with a target folder for a Data Validation Option schema.
- Target Folder: Enter the folder name that you defined for Data Validation Option mappings.
- Integration Service: Enter the name of the PowerCenter Integration Service.
- Data Validation Option Results Warehouse Connection: Enter the PowerCenter connection to the Data Validation Option repository that you created earlier in this procedure.
- Enable Metadata Manager: Enable or disable integration with the Metadata Manager service.
- Is secure connection: Select the checkbox if the Metadata Manager service runs over a secure connection.
- Server Host Name: Enter the server host name where the Metadata Manager service runs.
- Server Port: Enter the Metadata Manager service port.
- Resource Name: Enter the name of the PowerCenter resource.

b. Click Save. Data Validation Option offers to test the repository settings to make sure they are correct.
c. Test the repository settings to ensure that the settings are accurate. When you confirm the settings, Data Validation Option imports the PowerCenter sources, targets, and connections.
18. Optionally, you can add other repositories. Make sure that Contains Target Folder is false because only one repository can have a target folder.
19. Create a table pair with one test and run it to make sure the installation was successful.

RELATED TOPICS:
Troubleshooting on page 114

Data Validation Option Configuration for Additional Users


After you configure Data Validation Option for the first user, you can configure additional users on other client machines.
1. On the Data Validation Option Client machine, create an environment variable INFA_HOME and set the value to the location of the domains.infa file.
2. Install the Data Validation Option Client.
3. Start the Data Validation Option Client. The Data Validation Option Client prompts you to configure the repository.
4. Configure the Data Validation Option repository. You can use the details of an existing Data Validation Option repository.
5. Add the PowerCenter repository for Data Validation Option.
6. Optionally, configure additional PowerCenter repositories.

Note: You can also use the command CreateUserConfig to create additional Data Validation Option users.

RELATED TOPICS:
Command Line Integration on page 103

Data Validation Option Upgrade


You can upgrade Data Validation Option 3.0 and 3.1 to Data Validation Option 9.x. Upgrading to version 9.x does not affect test metadata or test results. Back up the Data Validation Option repository before you upgrade.

Upgrading from Version 9.1.x to Version 9.5.0


You can upgrade to the latest version of Data Validation Option from Data Validation Option 9.1.x.
1. Uninstall the Data Validation Option version 9.1.x Client.
2. Install the latest version of the Data Validation Option Client.
3. From the command line, go to the Data Validation Option installation folder.
4. Enter the DVOCmd UpgradeRepository command if this is the first upgrade of the client in a multi-user environment (see the example after this list).
5. On the Data Validation Option Client machine, create an environment variable called INFA_HOME and set the value to the location of the domains.infa file, if the variable does not exist. The domains.infa file is available in the following location: <PowerCenter installation directory>\clients\PowerCenterClient\
6. Restart the Data Validation Option Client.
7. Run the Data Validation Option Client. The Data Validation Option Client prompts you to enter the license key.
8. Click Browse and select the license key. The Data Validation Option Client prompts you to restart the application. Data Validation Option stores the license information in the Data Validation Option database schema.
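For example, steps 3 and 4 can be run from a command prompt as follows. The installation path shown assumes a default 64-bit installation and may differ in your environment:
cd "C:\Program Files (x86)\Informatica9.5.0\DVO"
DVOCmd UpgradeRepository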

Upgrading from Version 3.0


You can upgrade from Data Validation Option 3.0.
1. In Data Validation Option version 3.0, go to Tools > Properties, and note the name of the connection to the Data Validation Option repository.
2. Uninstall Data Validation Option version 3.0.
3. Install Data Validation Option.
4. From the command line, go to the Data Validation Option installation folder. To find the Data Validation Option installation folder, right-click the Data Validation Option button and click Open File Location.
5. Enter the DVOCmd UpgradeRepository command if this is the first upgrade of the client in a multi-user environment.
6. On the Data Validation Option Client machine, create an environment variable called INFA_HOME and set the value to the location of the domains.infa file.
7. Optionally, edit C:\Program Files< (x86)>\Informatica<version>\DVO\config\JMFProperties.properties and change the number of pmrep processes from 2 to 8. Provide a higher number of pmrep processes to increase performance when you run several tests simultaneously.
8. Start Data Validation Option.
9. Edit the following repository information:
- PowerCenter version
- Name of the connection to the Data Validation Option repository
10. Click Save. A prompt appears and asks you to verify the test settings.
11. Right-click the repository name in the Navigator and click Refresh.
12. Restart Data Validation Option.
13. Run the Data Validation Option Client. The Data Validation Option Client prompts you to enter the license key.
14. Click Browse and select the license key. The Data Validation Option Client prompts you to restart the application. Data Validation Option stores the license information in the Data Validation Option database schema.

Note: The database URL format has changed since version 3.1. If Data Validation Option fails to upgrade the database URL, select File > Settings > Preferences > Data Validation Option, and update the Database URL.

Upgrading from Version 3.1


You can upgrade from Data Validation Option 3.1.
1. Uninstall Data Validation Option version 3.1.
2. Install Data Validation Option.
3. From the command line, go to the Data Validation Option installation folder. To find the Data Validation Option installation folder, right-click the Data Validation Option button and click Open File Location.
4. Enter the DVOCmd UpgradeRepository command if this is the first upgrade of the client in a multi-user environment.
5. On the Data Validation Option Client machine, create an environment variable called INFA_HOME and set the value to the location of the domains.infa file. The domains.infa file is available in the following location: <PowerCenter installation directory>\clients\PowerCenterClient\.
6. Optionally, edit C:\Program Files< (x86)>\Informatica<version>\DVO\config\JMFProperties.properties and change the number of pmrep processes from 2 to 8. Provide a higher number of pmrep processes to increase performance when you run tests simultaneously.
7. Restart Data Validation Option.
8. Run the Data Validation Option Client. The Data Validation Option Client prompts you to enter the license key.
9. Click Browse and select the license key. The Data Validation Option Client prompts you to restart the application. Data Validation Option stores the license information in the Data Validation Option database schema.

Note: The database URL format has changed since version 3.1. If Data Validation Option fails to upgrade the database URL, select File > Settings > Preferences > Data Validation Option, and update the Database URL.

DVOCmd Installation on UNIX


DVOCmd is a command line program that you can use to run Data Validation Option tasks without Data Validation Option Client. You can install DVOCmd on a UNIX machine and run Data Validation Option commands through the shell. You do not need to run Informatica services on the machine. You must install and configure Informatica services in the network before you can use DVOCmd in UNIX. The installation is required for the command line programs such as pmrep and pmcmd, and libraries used by DVOCmd. To install DVOCmd on UNIX, untar the installer package (install-dvo-<version>.tar) in a location with read-write permission. You can find the .tar file inside the .zip package (Install_DataValidator_<version>.zip) that Informatica provides. For example, you can untar the DVOCmd package in the user home directory. DVOCmd and the associated files are available in the folder DVO.
cd $HOME tar xvf install-dvo-9.5.0.0.tar

After you install DVOCmd, you must set the INFA_DOMAINS_FILE environment variable to the location of the domains.infa file. Copy the domains.infa file from a Windows machine running the PowerCenter Client.
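For example, in a Bourne-style shell you might add lines like the following to the profile of the user that runs DVOCmd. The paths are placeholders for your environment:
export INFA_DOMAINS_FILE=$HOME/dvo/domains.infa
export PC_CLIENT_INSTALL_PATH=/opt/informatica/server/bin
export PATH=$PATH:$HOME/DVO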


Environment Variables
You can configure environment variables on the machine where you installed Data Validation Option. The following list describes the environment variables used by Data Validation Option; an example of setting two of them follows the list:
- DV_DOUBLE_COMPARISON_EPSILON: Tolerance value for floating point numbers. Default is 1e-15. The tolerance value does not affect join keys. The variable is applicable only for the 'equal to' operator.
- DV_TRANS_CACHE_SIZE_AUTO: Controls the automatic cache size setting for the data validation mapping. Set the value to "Y" to set the cache size for the data validation mapping to auto. If you don't set the variable, Data Validation Option uses the default cache size. Default is 20 MB for the data cache and 10 MB for the index cache. You can also set the cache size when you run the InstallTests and RunTests DVOCmd commands with the option --cachesize.
- DV_REPORT_ENGINE: Controls the BIRT reporting engine. Set the value as "Y" to turn the reporting engine on and "N" to turn the reporting engine off. You may need to set the environment variable when you face issues with the BIRT reporting engine in a Citrix environment.
- DV_MAPPING_STRING_MAX_PRECISION: Precision of string fields in data validation mappings.
- DV_RTRIM_JOIN_KEYS: Controls removal of trailing spaces from join keys. Set the value as "Y" to remove trailing spaces from join keys.
- PC_CLIENT_INSTALL_PATH: Location of the pmrep command line program. You must set the variable to run DVOCmd from the UNIX command line.
- INFA_DOMAINS_FILE: Location of the domains.infa file. You must set the variable to run DVOCmd from the UNIX command line.
- DV_CONFIG_DIR: Location of the user configuration directory. Default is <Windows user configuration directory>\DataValidator. You can change the path to launch the Data Validation Option Client with a different user configuration directory.
- INFA_HOME: Location of the PowerCenter installation where domains.infa is available.
- DV_PM_ROOT_DIR: The service variable in the PowerCenter Integration Service that specifies the PowerCenter root directory. Use this variable to define other service variables. For example, you can use $PMRootDir to define subdirectories for other service process variable values. You can set the $PMSessionLogDir service process variable to $PMRootDir/SessLogs.

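For example, to remove trailing spaces from join keys and loosen the floating-point tolerance for the current Windows command-prompt session, you might set the variables as follows before launching the client. The values are illustrative; set them as system environment variables instead if they should apply to every session:
set DV_RTRIM_JOIN_KEYS=Y
set DV_DOUBLE_COMPARISON_EPSILON=1e-10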

Modifying the License Key


You can modify the license key when the license expires or if you decide to change the Data Validation Option license. If you modify the license from the enterprise license to the standard license, you will lose all the test details stored for parameterization, bad records, and all the Jasper reports.
1. Select Help > Change License Key. The Change License Key dialog box appears.
2. Click Browse and select the license key.
3. Click OK. Restart the Data Validation Option Client.

JasperReports Server Setup


You must set up a JasperReports Server before you can generate Jasper reports. You can use the JasperReports Server available with Informatica Services or use a standalone JasperReports Server. The JasperReports Server available with Informatica Services is called the Reporting and Dashboards Service. Informatica 9.1.0 HotFix 1 and HotFix 2 include JasperReports Server 4.0.1. Informatica 9.1.0 HotFix 3 and later includes JasperReports Server 4.2.0. If you use Informatica 9.1.0 HotFix 1 or HotFix 2, contact Informatica Global Customer Support to obtain the required patch files that you must install to generate Jasper reports.

You can update the heap size of the JasperReports Server available with Informatica from the Administrator tool to improve performance. The default heap size is 512MB. Update the heap size based on the requirement. In the Administrator tool, select the required Reporting and Dashboards Service and update Maximum Heap Size in the Advanced Properties tab.

If you want to use a standalone JasperReports Server, download and install JasperReports Server 4.2.1 for the Windows 32-bit platform, Windows 64-bit platform, or Linux 64-bit platform. You can download the JasperReports Server installer from the following location: http://jasperforge.org/projects/jasperserver/downloads

You must update the standalone JasperReports Server with the DataDirect drivers available with Data Validation Option. Copy dwdb2.jar, dwsqlserver.jar, and dworacle.jar from the following location: <Data Validation Option installation directory>\lib. Paste the files in the following directory: <JasperReports Server installation directory>\tomcat\lib.

You must install the SSL certificate if you use a JasperReports Server that runs over an HTTPS connection. From the command line, browse to the following location: <Data Validation Option Installation Directory>\DVO\jre\bin

Run the following command:
keytool -importcert -alias <certificate alias name> -file "<certificate path>\<certificate filename>" -keystore ..\lib\security\cacerts
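For example, with a hypothetical certificate alias and file, the command might look like the following when run from the directory above:
keytool -importcert -alias jasperreports_ssl -file "C:\certs\jasperserver.cer" -keystore ..\lib\security\cacerts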


CHAPTER 5

Data Validation Option Management


This chapter includes the following topics:
Data Validation Option Management Overview, 32
User Configuration, 32
Multiple PowerCenter Installation Configuration, 34
Informatica Authentication, 34

Data Validation Option Management Overview


Data Validation Option allows you to use multiple users and multiple PowerCenter versions on the same client machine. When you configure a user, Data Validation Option stores the settings in a user configuration directory. You can configure multiple users in the Data Validation Option client to connect to multiple Data Validation Option repositories. You can also use multiple PowerCenter versions on the same client with Data Validation Option.

User Configuration
Data Validation Option creates a configuration directory for each user. The user configuration directory contains the user preferences file, preferences.xml, and the DVConfig.properties file. It also contains directories that store log files, reports, and temporary files that the user generates. By default, the user configuration directory is one of the following directories:
- 32-bit operating systems: C:\Documents and Settings\<user name>\DataValidator\
- 64-bit operating systems: C:\Users\<user name>\DataValidator\

You can specify or change the user configuration directory through a batch file, through the --confdir option in the command line, or through the environment variable DV_CONFIG_DIR.


Preferences File
Data Validation Option stores connection information in the preferences file for each user. Data Validation Option creates the preferences file in the user configuration directory. When the user opens Data Validation Option, the user does not have to enter Data Validation Option repository information. Data Validation Option reads the connection information from the preferences file. If you do not want users to have access to the database password, you can create users and preferences files through the DVOCmd CreateUserConfig command. Data Validation Option creates a preferences file for each user. If you create users through the CreateUserConfig command, each additional user must still perform all the configuration steps.
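For example, a hypothetical invocation that creates a preferences file for a user named jsmith might look like the following. The user name is illustrative, and the complete CreateUserConfig syntax and options are described in Command Line Integration on page 103:
DVOCmd CreateUserConfig jsmith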

Changing the User Configuration Directory through a Batch File


You can create a batch file with commands to run Data Validation Option with a specific user configuration directory. Create a batch file that contains the following command:
"<Data Validation Option Installation Directory>\DVOClient.exe" <User Configuration Directory>

For example:
"C:\Program Files\Informatica9.1.0\DVO\DVOClient.exe" C:\DVOConfig_Dev

RELATED TOPICS:
Command Line Integration on page 103

Changing the User Configuration Directory through an Environment Variable


You can change the user configuration directory through an environment variable on the Data Validation Option Client machine. To change the user configuration directory through an environment variable, create an environment variable, DV_CONFIG_DIR, with the value set as the full file path for the user configuration directory.
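For example, on Windows you might set the variable from a command prompt with the setx command. The directory path is illustrative:
setx DV_CONFIG_DIR "C:\DVOConfig_Dev"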

Multiple Data Validation Option Repository Access


The Data Validation Option repository stores the objects and tests that you create in Data Validation Option. If you work with multiple repositories, you must specify a unique user configuration directory for each repository. For example, you might want separate Data Validation Option repositories for your development and production environments.

To specify unique user configuration directories for each repository, create a batch file for each repository that starts the Data Validation Option Client and specifies the user configuration directory.

Suppose you install Data Validation Option in the default directory on 32-bit Windows. You want to set the development user configuration directory to C:\DVOConfig_Dev and the production user configuration directory to C:\DVOConfig_Prod. Create two batch files that start the Data Validation Option Client. For the development environment, enter the following text in the batch file:
"C:\Program Files\Informatica9.1.0\DVO\DVOClient.exe" C:\DVOConfig_Dev
For the production environment, enter the following text in the batch file:
"C:\Program Files\Informatica9.1.0\DVO\DVOClient.exe" C:\DVOConfig_Prod


Multiple PowerCenter Installation Configuration


You can configure a batch file to use a different PowerCenter version for Data Validation Option. If you want to use multiple PowerCenter versions with an installation of Data Validation Option, create batch files for each PowerCenter version. Create a batch file with the following entries:
@ECHO OFF
SET INFA_HOME=<INFA HOME PATH>
DVOClient.exe

In the batch file, set the INFA_HOME environment variable to the installation path of the PowerCenter version that you want to use, and then launch Data Validation Option. After you create a batch file, you must modify the Data Validation Option shortcuts to call the batch file instead of DVOClient.exe. If you want to use DVOCmd with a different PowerCenter installation, provide DVOCmd.exe instead of DVOClient.exe in the batch file.
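For example, a batch file that points Data Validation Option to a specific PowerCenter installation might contain the following lines. The installation path is illustrative:
@ECHO OFF
SET INFA_HOME=C:\Informatica\9.5.0
DVOClient.exe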

Informatica Authentication
Administrators can enable Informatica authentication so that users must use valid Informatica domain credentials to use the Data Validation Option Client. By default, Data Validation Option does not validate the users that launch the Data Validation Option Client. You can enable Informatica authentication if you have PowerCenter 9.0.1 or later. Informatica authentication occurs over a secure connection with TLS if you have enabled TLS in the Informatica domain. To configure Informatica authentication, you must have an Informatica login credential with administrator privileges. The information that you require to configure Informatica authentication is available in the nodemeta.xml file on the machine that hosts the Informatica services. You can also update the Informatica authentication properties through the DVOCmd command UpdateInformaticaAuthenticationConfiguration.

Data Validation Option Users and Informatica Users


Data Validation Option users and users with Informatica credentials can launch the Data Validation Option Client. Users connect to a Data Validation Option schema when they use the Data Validation Option Client. You can enable Informatica authentication so that only valid Informatica domain users can log in to Data Validation Option. To configure Informatica authentication, you must have an Informatica login credential with administrator privileges. To enable authentication, map each Data Validation Option user to an Informatica user. If the two users have the same name, Data Validation Option maps the users automatically. If the two users have different names, you can map Data Validation Option users to Informatica users with the DVOCmd command LinkDVOUsersToInformatica. The permissions of the Informatica user determine the PowerCenter metadata access for the associated Data Validation Option user, so it is important to ensure that those permissions and privileges are set correctly.


Security
You can enable the Transport Layer Security (TLS) protocol for secure authentication of Informatica domain users in Data Validation Option. Informatica authentication in Data Validation Option enables connection in a secure network with TLS. Enable TLS in Informatica Administrator to connect to Data Validation Option over a secure connection. You can enable TLS in the Informatica Administrator from the Informatica domain properties. After you enable TLS for the domain, configure Informatica authentication in Data Validation Option Client to use TLS. The properties for TLS are accessed from the Informatica domain.

Informatica Authentication Parameters


You must configure the host, port, and tlsEnabled elements available in the nodemeta.xml file to enable Informatica authentication. The nodemeta.xml file is available in the following location:
<Informatica_services_installation_directory>\isp\config

The elements of the nodemeta.xml file contain the attributes associated with the Informatica domain. Configure the following elements available in the nodemeta.xml file to enable Informatica authentication:
- host. Name of the machine that hosts Informatica services.
- port. Port through which Data Validation Option accesses the Informatica domain. Note: Do not use the value available in the httpport element.
- tlsEnabled. Indicates whether TLS is enabled in the domain. If TLS is not enabled in the Informatica domain, the tlsEnabled element is not available in the XML file.
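As an illustration only, and assuming the elements appear as simple XML values, the relevant entries in nodemeta.xml might look similar to the following. The host name and port are hypothetical, so check the actual file on the machine that hosts the Informatica services:
<host>infanode01</host>
<port>6005</port>
<tlsEnabled>true</tlsEnabled>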

Configuring Informatica Authentication


You can configure Informatica authentication in Data Validation Option to enable users to access the Data Validation Option Client with their Informatica credentials.
1. In the Data Validation Option Client, click File > Settings > Preferences. The Preferences dialog box appears. You can also double-click the Data Validation Option user in the Navigator.
2. Select Informatica Authentication.
3. Select Enable Informatica User Authentication.
4. Enter the Informatica domain host name and port specified in the nodemeta.xml file.
5. Select Is Secure Connection to authenticate the user over a secure connection. Ensure that TLS is enabled in the Informatica domain.
6. Click Test to test the domain settings.
7. Click Save to save the domain settings.
Data Validation Option Client prompts you to enter the Informatica login credentials when you restart the application.


CHAPTER 6

Repositories
This chapter includes the following topics:
Repositories Overview, 36
Adding a Repository, 36
Editing Repositories, 37
Deleting Repositories, 37
Refreshing Repositories, 37
Metadata Export and Import, 38
Metadata Manager Integration, 39

Repositories Overview
Data Validation Option connects to a PowerCenter repository to import metadata for PowerCenter sources, targets, folders, and connection objects. Data Validation Option also connects to a PowerCenter repository to create mappings, sessions, and workflows in the Data Validation Option target folder.

When you add a repository to Data Validation Option, you add either a source or target repository. You can add one target repository and multiple source repositories.

The target repository contains metadata for PowerCenter sources, targets, folders, and connection objects. It also contains the Data Validation Option target folder. The target folder stores the mappings, sessions, and workflows that Data Validation Option creates when you run tests. Do not store other PowerCenter mappings, sessions, or workflows in this folder.

A source repository contains metadata for PowerCenter sources, targets, and folders. Add source repositories to Data Validation Option if you want to compare tables from different repositories. When you add a source repository, you must verify that all connection objects in the source repository also exist in the target repository. Data Validation Option uses the connection objects in the target repository when you run tests on table pairs.

The version number for a source repository can differ from the version number for the target repository. The version numbers for two source repositories can also differ.

Adding a Repository
1. Right-click INFA Repositories in the Navigator.
2. Select Add Repository. The Repository Editor dialog box appears.
3. Enter the repository properties. Set Contains Target Folder to true when you add a target repository and to false when you add a source repository.
4. Click Test to test the repository connection. Data Validation Option verifies the connection properties. If the repository is a target repository, Data Validation Option also verifies the PowerCenter Integration Service and verifies that the Data Validation Option target folder and the results warehouse connection exist in the repository.

Editing Repositories

To edit a repository, right-click the repository, and select Edit Repository, or double-click the repository. The Repository Editor dialog box appears. You can update any property that is enabled.

Deleting Repositories

To delete a repository from Data Validation Option, right-click the repository, and select Delete Repository. Data Validation Option deletes the repository and all table pairs, single tables, tests, and views based on the repository data.

Refreshing Repositories
You refresh a source repository when the contents of the PowerCenter repository have changed. You usually refresh a target repository only when there are additions or changes to connection objects.

When you refresh a repository, Data Validation Option reimports source, target, folder, and connection metadata from the PowerCenter repository. Therefore, Data Validation Option objects that use changed or deleted PowerCenter objects might no longer be valid after you refresh a repository. If you created table pairs, single tables, or tests with tables that were deleted from the PowerCenter repository, Data Validation Option deletes them when you refresh the repository.

1. To refresh all repositories at once, right-click INFA Repositories in the Navigator, and select Refresh All Repositories. To refresh one repository, right-click the repository, and select Refresh Repository.
2. When you refresh one repository, you can select the objects to refresh. Select one of the following options:
- Everything. Data Validation Option reimports all source, target, folder, and connection metadata. It updates the folder list, and the Sources and Targets folders in the Navigator.
- Connections. Data Validation Option reimports connection metadata. Select this option when a PowerCenter user adds, removes, or updates connection objects.
- Folder List. Data Validation Option reimports folder metadata. It updates the folder list in the Navigator. Select this option when a PowerCenter user adds or removes folders.
- Folders (Sources and Targets). Data Validation Option reimports source and target metadata. It refreshes the contents of the Sources and Targets folders in each folder in the repository. Select this option when a PowerCenter user adds, removes, or modifies sources or targets in folders.

You can also refresh repository folders individually. You might refresh a folder after you refresh the folder list and Data Validation Option imports a new folder. To refresh a repository folder, right-click the folder in the Navigator, and select Refresh Folder (Sources and Targets). Data Validation Option refreshes the contents of the Sources and Targets folders within the folder you refresh.

Note: Refreshing everything in a repository or refreshing all repositories can take several minutes to several hours, depending on the size of the repositories. If you work with a small number of repository folders, you can shorten refresh time by refreshing the folders individually.

Exporting Repository Metadata


You can export repository metadata to a file. You might want to export repository metadata when you migrate from a development environment to a production environment, or if you are asked to do so by Informatica Global Customer Support. To export repository metadata from Data Validation Option, right-click the repository, and select Export Metadata. Data Validation Option prompts you for a file name and file path.

Metadata Export and Import


Data Validation Option allows you to export and import test metadata from the repositories. Metadata import and export allows users to share tests and allows rapid generation of tests through scripting. Scripting is particularly useful in the following scenarios:
- You have a very large number of repetitive tests for different table pairs. In this situation, it might be faster to generate the tests programmatically.
- The source-to-target relationships and rules are defined in a spreadsheet. This often happens during data migration. You can script the actual Data Validation Option tests from the spreadsheets.

You can import and export the following metadata:
- Table pairs
- Single tables
- PowerCenter sources
- SQL views
- Lookup views
- Join views

RELATED TOPICS:
Metadata Import Syntax on page 138

Exporting Metadata
Data Validation Option allows you to export selected objects, such as table pairs or SQL views, and all of their dependencies, such as tables, to an XML file. You can also export all objects to an XML file.

To export an object, right-click the object and select Export Metadata. To export all objects, select File > Export All Metadata. Data Validation Option prompts you for the metadata export file name and directory.

Importing Metadata
When you import metadata from an XML file, you can overwrite repository objects such as table pairs and views that have the same type and same name as objects you are importing. To overwrite repository objects when you import, select File > Import Metadata (Overwrite). To import metadata without overwriting repository objects, select File > Import Metadata. When you import metadata without overwriting objects, Data Validation Option stops importing metadata if an object in the import file and an object in the repository are of the same type and have the same name. When you import metadata, you can automatically generate value tests as you do when you right-click a table pair and select Generate Value Tests. To do this, use the generate-tests command in the import XML file. Place the command at the end of the metadata definition for the table pair. For example, to generate value tests for a table pair named CustDetail_CustStage, add the following lines to the import XML file at the end of the table pair metadata definition:
<Commands>
    generate-tests("CustDetail_CustStage");
</Commands>

Metadata Manager Integration


You can view the metadata of data sources if you configure Metadata Manager integration in Data Validation Option. You can also analyze the impact of test results on data sources if you enable Metadata Manager integration. You can view the metadata of the data sources in the PowerCenter repository. To view the metadata of data sources, you must set up a Metadata Manager Service in the Informatica domain. You must create a PowerCenter resource for the PowerCenter repository that contains the data source. Right-click a data source in the repository and select Get Metadata to view the metadata of the data source.


Configuring Metadata Manager Integration


Configure Metadata Manager integration in Data Validation Option to view the data lineage of data sources.
1. Right-click the repository for which you want to enable Metadata Manager integration and select Edit Repository.
2. Select Enable Metadata Manager Integration.
3. Select whether the Metadata Manager Service runs on a secure connection.
4. Enter the server host name.
5. Enter the Metadata Manager Service port.
6. Enter the name of the PowerCenter resource.
7. Click Test to test the settings.
8. Click Save.


CHAPTER 7

Table Pairs
This chapter includes the following topics:
Table Pairs Overview, 41
Table Pair Properties, 41
Adding Table Pairs, 48
Editing Table Pairs, 49
Deleting Table Pairs, 49
Overall Test Results, 49

Table Pairs Overview


A table pair is the basis for all tests that compare one table to another. You can select a relational table, flat file, lookup view, SQL view, or join view as one or both tables in a table pair. Data Validation Option treats Table A as the master table and Table B as the detail table. When you select tables in a table pair, select the master table as Table A and the detail table as Table B to improve performance. If you have the enterprise license, you can store all the error records after you run a test for the table pair. You can also use a parameter file that the PowerCenter Integration Service applies when Data Validation Option runs the sessions associated with the table pair.

Table Pair Properties


You can view table pair properties by selecting a table pair in either the Navigator or the Table Pairs tab and viewing the properties. The properties vary depending on the types of objects you select for the table pair.

The following table describes the table pair properties:
- Table A/B. The first or second table in the table pair.
- Conn A/B. Connection details for the table.
- Execute where clause A/B. Filters the records that the PowerCenter Integration Service reads from the database. Enter a valid PowerCenter Boolean expression or an SQL WHERE clause without the WHERE keyword.
- Optimization Level. Controls which test logic Data Validation Option converts to a PowerCenter mapping and which test logic it pushes to the database. You can select one of the following options: Default - Data Validation Option converts all test logic to a PowerCenter mapping and applies sorting to the data source. WHERE clause, Sorting, and Aggregation in DB - Data Validation Option pushes the WHERE clause, sorting logic for joins, and all aggregate tests to the database. Data Validation Option converts all other test logic to a PowerCenter mapping. Already Sorted Input - The PowerCenter mapping does not sort the input. If the data source is not sorted, tests may fail.
- Description. Table pair description. By default, Data Validation Option uses "Joined <Table A><Table B>" for joined table pairs. It uses "<Table A>-<Table B>" for table pairs that are not joined.
- External ID. Identifier for the table pair that you can use when you run Data Validation Option tests at the command line.
- Table Join. Join condition for the tables.

RELATED TOPICS:
Single-Table Constraints on page 62
Tests for Single-Table Constraints on page 70

Connection Properties
Choose the connection properties based on the data source type. You must select a connection for all the data sources except for flat files. For flat files, you must provide the source directory and the file name. Connections are PowerCenter connection objects created in the Workflow Manager.

Relational Connection Properties


Choose the relational connection properties for Microsoft SQL Server, Oracle, IBM DB2, Netezza, and PowerExchange for DB2 data sources. Configure the following properties when you select a relational data source:
- Connection. PowerCenter connection object to connect to the relational data source.
- Override Owner Name. Override the database name and schema of the source. For example, a Microsoft SQL Server table is identified by <database>.<schema>.<table>. To override the database and the schema, enter <new database name>.<new schema name> in the text box. To change only the schema, enter <new schema name> in the text box. You cannot change only the database name.


SAS and Salesforce Connection Properties


You can use a SAS or Salesforce data source in Data Validation Option. Select the PowerCenter connection object when you select a SAS or Salesforce data source.
Note: You cannot override the owner name for SAS and Salesforce data sources.

SAP Connection Properties


You must configure the SAP authentication information to use SAP data sources. Configure the following properties when you select an SAP data source:
- Connection. PowerCenter connection object to connect to the SAP data source.
- SAP User Name. SAP source system connection user name. Must be a user for which you have created a source system connection.
- SAP Password. Password for the user name.
- SAP Client. SAP client number.
- SAP Language. Language you want for the mapping. Must be compatible with the PowerCenter Client code page. Data Validation Option does not authenticate the value. Ensure that you enter the correct value so that the tests run successfully.
- SAP Connect String. Type A or Type B DEST entry in saprfc.ini.

SAP Data Sources in Data Validation Option


You cannot override the owner name for an SAP data source. Data Validation Option uses the stream mode for installation of the ABAP programs and cannot use the FTP mode. SAP data sources must not contain the slash (/) character in the field names.

Flat File Connection Properties


You can use flat file data sources in the PowerCenter repository. Configure the following properties when you select a flat file data source:
- Source Dir. Directory that contains the flat file. The path is relative to the machine that hosts Informatica Services.
- Source File. File name with the file name extension.

Database Processing
When you include a large table in a table pair, you can optimize the way Data Validation Option joins table data. Data Validation Option uses joins in value and set tests. To run value tests on a table pair, you must join the tables based on the related keys in each table. When you run set tests, Data Validation Option joins the tables.


By default, Data Validation Option joins the tables with an inner equijoin and sorts the rows in the table. The join condition uses the following WHERE clause syntax:
Table A.column_name = Table B.column_name

If one of the tables in a table pair contains a large volume of data compared to the other table, you can improve test performance by designating the larger table as the detail source and the smaller table as the master source. Select the smaller table as Table A and the larger table as Table B. The PowerCenter Integration Service compares each row of the master source against the detail source when it determines whether to join two records.

If both tables in a table pair are large, you can improve test performance by using sorted input. When you use sorted data, the PowerCenter Integration Service minimizes disk input and output when it runs a test. You can further increase performance for large relational tables by pushing the sorting logic to the database.

Pushing Test Logic to the Database


You can push some test logic for a relational table to the source database. By default, Data Validation Option creates a PowerCenter mapping for each table pair. When you run a test, PowerCenter runs the mapping logic in a session. In some cases, you can significantly increase test performance by processing logic in the database instead of a PowerCenter session. You can push the WHERE clause, logic for aggregate tests, and sorting logic for joins to the source database.

Pushing the WHERE clause to the database can reduce the number of rows the PowerCenter Integration Service reads when it runs a Data Validation Option test. For example, Table A is a table of U.S. customers, and you want to test data only for customers in California. You enter a WHERE clause such as STATE = 'CA'. Data Validation Option creates a mapping in PowerCenter that reads all U.S. customers. The mapping might contain a Filter transformation that removes all records for customers outside of California. If you push the WHERE clause to the database, the database filters customer records. The PowerCenter Integration Service reads records for California customers only. Pushing the WHERE clause to the database increases the performance of the PowerCenter session because the Integration Service reads a small subset of records instead of all records in the table.

Pushing aggregate test logic to the database can also reduce the number of rows the PowerCenter Integration Service reads from the database. For example, you use a COUNT test to compare the number of non-null records in two Customer_ID columns. If you push the test logic to the database, the PowerCenter Integration Service does not have to import all customer records in order to count them.

Pushing the sorting logic for joins to the database causes the database to sort records before it loads them to PowerCenter. This minimizes disk input and output when Data Validation Option runs a test. The reduction in input and output is greatest for large tables, so you might want to push sorting logic to the database when you run tests on tables with large volumes of data.

Pushing test logic to the database slightly increases the load on the database. Before you push test logic to the database, you must decide whether the increased performance of the test outweighs the increased load on the database.

WHERE Clauses
Use a WHERE clause to limit the records returned from a data source and made available for a test. Since PowerCenter pulls data from each data source individually, you can provide separate WHERE clauses or filters for the data pulled from each table. Enter the WHERE clause without the WHERE keyword, for example, CITY <> 'London'. The PowerCenter Integration Service is case-sensitive when it reads WHERE clauses. This functionality corresponds to the use of the Filter transformation in PowerCenter.


Data Validation Option does not check the WHERE clause syntax. If the PowerCenter Integration Service executes the WHERE clause, any valid PowerCenter expression, including expressions that use PowerCenter functions, is allowed. If the PowerCenter syntax is not valid, a mapping installation error occurs.

Use the following guidelines if the data source executes the WHERE clause:
- Relational data source. The WHERE clause must be a valid SQL statement. If the SQL statement is not valid, a runtime error occurs. Enter a WHERE clause in the SQL format when you push down the WHERE clause into the database.
- SAP data source. The WHERE clause must be a valid SAP filter condition in the ERP source qualifier.
- Salesforce data source. The WHERE clause must be a valid SOQL filter condition.
- SAS data source. The WHERE clause must be a valid Whereclause Overrule condition in the SAS source qualifier.

When you enter a WHERE clause, consider the following issues:
- Data Validation Option uses two WHERE clauses instead of one. A typical SQL statement has one WHERE clause. Data Validation Option, however, has one WHERE clause for Table A and one for Table B. Therefore, it is possible that more data comes from one table than the other. For example, applying emp_id < 10 to Table A but not Table B results in only nine records coming from Table A and all records from Table B. This affects OUTER_VALUE and aggregate tests, which might or might not be what you intended. However, when you compare production to development where the production environment has three years of data and development only has two weeks, applying a WHERE clause to production equalizes the data sets.
- Certain validation problems can be solved through a nested SQL WHERE clause. For example, if you want to filter for employees with disciplinary issues, use the following WHERE clause (assuming it is executed in the database):
  emp_id IN (SELECT DISTINCT emp_id FROM table_discipline)
- Because the filter condition you enter in the WHERE clause applies to all tests in the table pair, Data Validation Option applies the WHERE clause before it joins the tables. This can improve performance when the WHERE clause filters a large percentage of rows from the source table because the PowerCenter Integration Service processes fewer rows later in the mapping. If you want to enter a condition that filters a small percentage of rows, or you want to apply different filters for different tests, you can enter a filter condition in the Table Pair Test Editor dialog box.

Table Joins
You can join or unjoin a table pair. You must join the tables if you want to run a VALUE or OUTER_VALUE test. Data Validation Option ignores joins for all set and aggregate tests.

To create a joined table pair, define one or more conditions based on equality between the tables. For example, if both tables in a table pair contain employee ID numbers, you can select EMPLOYEE_ID as the join field for one table and EMP_ID as the join field for the other table. Data Validation Option performs an inner equijoin based on the matching ID numbers.

You can join tables only on fields of like datatypes. For example, you can join an INT field to a DECIMAL field, but not to a DATETIME or VARCHAR field. Data Validation Option supports numeric, string, datetime, and binary/other datatypes. Joins are not allowed on binary/other datatypes.

You can create a join with one set of fields or with multiple sets of fields from each table if the table requires this to produce unique records. Note that additional sets of fields increase the time necessary to join two sources. The order of the fields in the join condition can also impact the performance of Data Validation Option tests. If you use multiple sets of fields in the join condition, Data Validation Option compares the ports in the order you specify.


When you create a join, you can select a field from each table or enter an expression. Enter an expression to join tables with key fields that are not identical. For example, you have two customer tables that use different cases for the LAST_NAME field. Enter the following expression for one of the join fields:
lower(LAST_NAME)

When you enter an expression for one or both join fields, you must specify the datatype, precision, and scale of the result. The datatype, precision, and scale of both join fields must be compatible. The expression must use valid PowerCenter expression syntax. Data Validation Option does not check the expression syntax.

Bad Records Configuration


If you have the enterprise license, you can store up to 16,000,000 bad records per test to perform advanced analysis and up to 1000 bad records for reporting. If you have the standard license, you can store up to 1000 bad records for reporting and analysis.

For reporting, you can set the maximum number of bad records in the mapping properties in the Preferences dialog box. The default number of bad records is 100. You can set up to 1000 bad records for reporting.

For advanced analysis, you can enter the maximum number of bad records for detailed analysis, along with the file delimiter, in the Detailed Error Rows Analysis section of the mapping properties. The default number of bad records is 5000. You can set up to 16,000,000 bad records.

At the table pair or single table level, you can choose to store all the bad records from the tests for the table pair or single table. Select Save all bad records for test execution on the Advanced tab when you create or edit a table pair or single table.

Data Validation Option stores bad records for the following table pair tests:
- Value
- Outer Value
- Set

Data Validation Option stores bad records for the following single-table constraint tests:
- Value
- Unique

You must select whether you want to store the bad records in a flat file or a table in the Data Validation Option schema. If you store bad records in a flat file, you can optionally enter the name of the file. Data Validation Option appends test information to the name and retains the file extension.

Note: If you modify the file delimiter in the preferences file, run the InstallTests command with the forceInstall option for the existing table pairs or single tables that you already ran. You can also edit and save the table pair or single table from the Data Validation Option Client before you run the test. If you modify the bad records value, you do not need to reinstall the tests.

Bad Records in Flat File


If you configure Data Validation Option to store bad records in a flat file, Data Validation Option creates the flat file on the machine that runs Informatica services. The flat files that Data Validation Option generates after running the tests are stored in the following folder: <PowerCenter installation directory>\server\infa_shared\TgtFiles

You can modify the folder from the Administrator tool. Edit the $PMTargetFileDir property for the PowerCenter Integration Service.

Data Validation Option generates a folder for each table pair or single table. The name of the folder is in the following format:
TablePairName_TestRunID or SingleTableName_TestRunID

Data Validation Option creates flat files for each test inside the folder. The name of the flat file is in the following format:
<user defined file name>_TestCaseType_TestCaseColumnA_TestCaseColumnB_TestCaseIndex.<user defined file extension>

You can get the Test Case Index from the Properties tab and the Test Run ID from the results summary of the test in the detail area. You can get the Table Pair ID or Single Table ID from the Table Pair or Single Table properties.

For example, you enter the file name as BAD_ROWS.txt when you configure the table pair or single table, and you run an outer value test on fields FIRSTNAME and FIRSTNAME. The test case index is 1 and the test fields are expressions. The bad records file after you run the test has the name BAD_ROWS_OUTER_VALUE_ExprA_ExprB_1.txt.

Data Validation Option supports all the file delimiters that PowerCenter supports. When you enter non-printable characters as delimiters, enter the corresponding delimiter code in PowerCenter. When you import these files in PowerCenter, you have to manually create the different data fields because the code appears in place of the delimiter in the bad records file.

Caution: Data Validation Option uses a comma as a delimiter if there are multiple primary keys. Do not use a comma as a file delimiter.

When you run the tests for the table pair or single table, Data Validation Option stores the details of the bad records in the following format:
Table Pair Name
Table A Name
Table A Connection
Table B Name
Table B Connection
Test Definition
Test Run Time
Test Run By User
Key A[],Result A[],Key B[],Result B[]

If the tests pass, Data Validation Option still creates a flat file without any bad records information.

Bad Records in Database Schema Mode


If you choose to save detail bad records in the Data Validation Option schema, the bad records are written to the detail tables.

The following table describes the tables to which Data Validation Option writes the bad records based on the type of test:
- Value, Outer Value, Value Constraint. Table: ALL_VALUE_RESULT_DETAIL. Columns: TEST_RUN_ID, TEST_CASE_INDEX, KEY_A, VALUE_A, KEY_B, VALUE_B
- Unique Constraint. Table: ALL_UNIQUE_RESULT_DETAIL. Columns: TEST_RUN_ID, TEST_CASE_INDEX, VALUE
- Set. Table: ALL_SET_RESULT_DETAIL. Columns: TEST_RUN_ID, TEST_CASE_INDEX, VALUE_A, VALUE_B

Note: You must ensure that the database table has enough table space to hold all the bad records.

The test details of all the tests that you run in Data Validation Option are available in the TEST_CASE table. The test installation details of the tests are available in the TEST_INSTALLATION table. You can obtain the TEST_ID of a test from the TEST_INSTALLATION table. You need the TEST_ID of a test to query the complete details of a test from the TEST_CASE table.


You can get the Test Case Index from the Properties tab and the Test Run ID from the results summary of the test in the detail area. You can get the Table Pair ID or Table ID from the Table Pair or Single Table properties.

For example, you ran a table pair with a value test and an outer value test. The following SQL query is a sample query that retrieves the bad record details for the test run with Test Run ID 220:
select ALL_VALUE_RESULT_DETAIL.*, TEST_CASE.*
from ALL_VALUE_RESULT_DETAIL, TEST_RUN, TEST_INSTALLATION, TEST_CASE
where ALL_VALUE_RESULT_DETAIL.TEST_RUN_ID = TEST_RUN.TEST_RUN_ID
and TEST_RUN.TEST_INSTALLATION_ID = TEST_INSTALLATION.TEST_INSTALLATION_ID
and TEST_INSTALLATION.TABLE_PAIR_ID = TEST_CASE.TEST_ID
and ALL_VALUE_RESULT_DETAIL.TEST_CASE_INDEX = TEST_CASE.TEST_CASE_INDEX
and ALL_VALUE_RESULT_DETAIL.TEST_RUN_ID = 220
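A similar query can retrieve the bad records that a unique constraint test writes to the ALL_UNIQUE_RESULT_DETAIL table. The following sketch assumes the same join pattern as the query above, and 220 is an illustrative test run ID:
select ALL_UNIQUE_RESULT_DETAIL.*, TEST_CASE.*
from ALL_UNIQUE_RESULT_DETAIL, TEST_RUN, TEST_INSTALLATION, TEST_CASE
where ALL_UNIQUE_RESULT_DETAIL.TEST_RUN_ID = TEST_RUN.TEST_RUN_ID
and TEST_RUN.TEST_INSTALLATION_ID = TEST_INSTALLATION.TEST_INSTALLATION_ID
and TEST_INSTALLATION.TABLE_PAIR_ID = TEST_CASE.TEST_ID
and ALL_UNIQUE_RESULT_DETAIL.TEST_CASE_INDEX = TEST_CASE.TEST_CASE_INDEX
and ALL_UNIQUE_RESULT_DETAIL.TEST_RUN_ID = 220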

Parameterization
If you have the enterprise license, you can perform incremental data validation through parameterization. You can use parameters in the WHERE clause in table pairs and single tables.

Before you use a parameter in the WHERE clause, you must enter the name of the parameter file and add the parameters on the Advanced tab of a table pair or single table definition. You must specify the data type, scale, and precision of the parameter. After you add the parameters, you can use them in a WHERE clause to perform incremental validation. You can validate the expression with the parameters that you enter in a table pair or single table.

The parameter file must be in a location accessible to the Data Validation Option Client. The parameters in the parameter file must be in the format $$<parameter name>=value. Ensure that the parameter that you add in a table pair or single table is available in the parameter file. When you run a test, Data Validation Option looks up the value of the parameter from the parameter file and runs the WHERE clause based on that value.

If the Informatica services run on a Windows machine, you can place the parameter files in a folder on the server. Ensure that the Data Validation Option Client can access the folder. In the Data Validation Option Client, enter the parameter file location as a network path to the parameter file on the server in the following format:
\\<server machine host name>\<shared folder path>\<parameter file name>

If the Informatica services run on a UNIX machine, you can place the parameter files in a folder on the server. Install DVOCmd on the server. You can configure parameterization on the Data Validation Option Client, provide the absolute path of the parameter file, and run the tests from the server with DVOCmd. Alternatively, you can ensure that the Data Validation Option Client can access the folder and enter the parameter file location as a network path to the parameter file on the server. Then run the tests from the Data Validation Option Client.
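For example, a parameter file might contain an entry such as the following. The parameter name and value are illustrative:
$$LAST_RUN_DATE=2012-07-01
A WHERE clause for incremental validation might then reference the parameter, for example, UPDATE_DATE > $$LAST_RUN_DATE, where UPDATE_DATE is a hypothetical column in the source table and the exact quoting depends on the parameter data type.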

Adding Table Pairs


You can create a table pair from the File menu or from the shortcut on the menu bar.
1. Select the folder to which you want to add the table pair.
2. Click the table pair shortcut on the menu bar, or click File > New > Table Pair. The Table Pair Editor window appears.
3. Browse and select the data sources that you want to use as Table A and Table B of the table pair. You can search for a data source by name or path. You can search for lookup views, SQL views, and join views only by their names.
4. Click Edit and configure the connection properties for the table.
5. Enter the WHERE clauses you want to execute on the data sources. Enter the WHERE clause in the data source format if you run the WHERE clause in the database. Otherwise, enter the WHERE clause in the PowerCenter format.
6. Enter the description for the table pair.
7. Enter the external ID for the table pair. You can use the external ID to execute the table pair from the command line, as shown in the example after these steps.
8. If the data source is a relational source or an application source, you can choose whether to execute the WHERE clause within the data source. If you choose to run the WHERE clause within the data source, the PowerCenter Integration Service passes the WHERE clause to the data source before it loads data. Ensure that the WHERE clause you enter is in the data source format.
9. Select the database optimization level.
10. Enter the join fields for the two tables.
11. If you have the enterprise license, click the Advanced tab. You can select whether to store all the bad records for test execution and enter the storage details. You can also enter the details of the parameter file and the parameters.
12. Click Save.
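For example, if you assign a table pair the external ID CustDetail_CustStage, a command-line run of its tests might look like the following. The external ID is illustrative, and the exact DVOCmd syntax and options are described in Command Line Integration on page 103:
DVOCmd RunTests CustDetail_CustStage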

Editing Table Pairs


To edit a table pair, right-click the table pair in the Navigator and select Edit Table Pair. You can also edit a table pair by double-clicking the table pair. When you edit a table pair, the Table Pair Editor dialog box opens.

Deleting Table Pairs


To delete a table pair, right-click a table pair in the Navigator and select Delete Table Pair. You can also select a table pair and press the Delete key to delete a table pair. When you delete a table pair, Data Validation Option deletes all of the associated tests. Data Validation Option does not delete lookup, join, or SQL views used in the table pair.

Overall Test Results


After you add a table pair to Data Validation Option, you can view the properties by selecting the table pair in the Navigator.


When you select a table pair in the Navigator, all the tests run on the table pair appear in the right pane. The run summary of all the tests appears at the top of the page. Select a test to view the details of that test in the bottom pane. You can view the test properties on the Properties tab and the test results on the Results tab. You can view important test details, such as the Test Case ID, on the Properties tab along with the other test details.


CHAPTER 8

Tests for Table Pairs


This chapter includes the following topics:
Tests for Table Pairs Overview, 51
Test Properties, 51
Adding Tests, 57
Editing Tests, 57
Deleting Tests, 57
Running Tests, 58
Automatic Test Generation, 58
Bad Records, 61

Tests for Table Pairs Overview


You can run the following types of tests on table pairs:
- Aggregate. Includes COUNT, COUNT_DISTINCT, COUNT_ROWS, MIN, MAX, AVG, and SUM.
- Set. Includes AinB, BinA, and AeqB.
- Value. Includes VALUE and OUTER_VALUE.

Note: When you run tests, the target folder must be closed in the Designer and Workflow Manager. If the target folder is open, Data Validation Option cannot write to the folder, and the tests return an error.

Test Properties
You can apply properties for the table pair test.


The following table describes the test properties:


- Function. The test you run, such as COUNT, COUNT_DISTINCT, AinB, or VALUE.
- Field A/B. The field that contains the values you want to compare when you run the test. You must select a field from each table in the table pair.
- Condition A/B. Allows you to filter records after Data Validation Option joins the tables in the table pair. Enter a valid PowerCenter Boolean expression.
- Operator. The arithmetic operator that defines how to compare each value in Field A with each value in Field B.
- Threshold. The allowable margin of error for an aggregate or value test that uses the approximate operator. You can enter an absolute value or a percentage value.
- Max Bad Records. The number of records that can fail comparison for a test to pass. You can enter an absolute value or a percentage value.
- Case Insensitive. Ignores case when you run a test that compares string data.
- Trim Trailing Spaces. Ignores trailing spaces when you run a test that compares string data. Data Validation Option does not remove the leading spaces in the string data.
- Null=Null. Allows null values in two tables to be considered equal.
- Comments. Information about a test. Data Validation Option displays the comments when you view test properties in the Properties area.
- Field A/B is Expression. Allows you to enter an expression for Field A or Field B.
- Datatype. The datatype for the expression if Field A or Field B is an expression.
- Precision. The precision for the expression if Field A or Field B is an expression.
- Scale. The scale for the expression if Field A or Field B is an expression.

Tests
The following table describes the table pair tests:
- COUNT. Compares the number of non-null values for each of the selected fields. This test works with any datatype. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime.
- COUNT_DISTINCT. Compares the distinct number of non-null values for each of the selected fields. This test works with any datatype except binary. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime.
- COUNT_ROWS. Compares the total number of values for each of the selected fields. This test counts nulls, unlike the COUNT and COUNT_DISTINCT tests. This test works with any datatype.
- MIN. Compares the minimum value for each of the selected fields. This test works with any datatype except binary. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime.
- MAX. Compares the maximum value for each of the selected fields. This test works with any datatype except binary. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime.
- AVG. Compares the average value for each of the selected fields. This test can only be used with numeric datatypes.
- SUM. Compares the sum of the values for each of the selected fields. This test can only be used with numeric datatypes.
- SET_AinB. Determines whether the entire set of values for Field A exists in the set of values for Field B. This test works with any datatype except binary/other. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime. You can use this test to confirm that all values in a field exist in a lookup table. This test examines all values for a column instead of making a row-by-row comparison.
- SET_BinA. Determines whether the entire set of values for the field selected from Table B exists in the set of values for the field selected from Table A. This test works with any datatype except binary/other. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime. You can use this test to confirm that all values in a field exist in a lookup table. This test examines all values for a column instead of making a row-by-row comparison.
- SET_AeqB. Determines whether the sets of values for the selected fields are exactly the same when compared. This test works with any datatype except binary. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime. You can use this test to confirm that all values in a field exist in a lookup table. This test examines all values for a column instead of making a row-by-row comparison.
- SET_ANotInB. Determines whether there are any common values between the selected fields. If there are common values, the test returns an error. If there are no common values, the test succeeds.
- VALUE. For joined table pairs, this test compares the values for the fields in each table, row by row, and determines whether they are the same. If there are any rows that exist in one table but not the other, the rows are disregarded, which implies an inner join between the tables. If the fields are both null and the Null=Null option is disabled, this pair of records fails the test. This test works with any datatype except binary. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime.
- OUTER_VALUE. For joined table pairs, this test compares the values for the fields in each table, row by row, and determines whether they are the same. If there are any rows that exist in one table but not the other, they are listed as not meeting the test rules, which implies an outer join between the tables. For the test to pass, the number of rows for the tables, as well as the values for each set of fields, must be equal. If the fields are both null and the Null=Null option is disabled, this set of records fails the test. This test works with any datatype except binary. The fields you compare must be of the same general datatype, for example, numeric-to-numeric or datetime-to-datetime.
Fields A and B
To create a test, you must select the fields that contain the values you want to compare from each table in the table pair. The fields available in each table appear in Field A and Field B. Select a field from each table.


Conditions A and B
You can filter the values for each field in a VALUE or OUTER_VALUE test to exclude rows that do not satisfy the test condition. For example, you want to exclude telephone extension numbers that contain fewer than three characters. Use the following VALUE test:
Table A.EXT = Table B.EXT, Condition A = LENGTH(EXT)>=3, Condition B = LENGTH(EXT)>=3

The filter condition you enter for a test differs from the WHERE clause you enter for a table pair. Data Validation Option applies the WHERE clause to all tests in the table pair, before it joins the tables. It applies the test filter condition after it joins the tables. You might want to use a test filter condition instead of a filter condition in the WHERE clause when the filter condition does not remove a large percentage of rows. This can improve performance if you run one test on the table pair.

Data Validation Option does not check the condition syntax. Any valid PowerCenter expression, including expressions that use PowerCenter functions, is allowed. If the PowerCenter syntax is not valid, a mapping installation error occurs when you run the test.

To enter a filter condition for either field, enter the filter condition in the Condition A or Condition B field. Because the PowerCenter Integration Service processes the filter condition, it must use valid PowerCenter syntax. Enter the field name in the filter condition, for example, Emp_ID > 0. Do not include the WHERE keyword.

Operator
The operator defines how to compare the test result for Field A with the test result for Field B. Enter an operator for aggregate and value tests. The following table describes the operators available in the Operator field:
- = (Equals). Implies that the test result for Field A is the same as the test result for Field B. For example, SUM(Field A) is the same as SUM(Field B).
- <> (Does not equal). Implies that the test result for Field A is not the same as the test result for Field B.
- < (Is less than). Implies that the test result for Field A is less than the test result for Field B.
- <= (Is less than or equal to). Implies that the test result for Field A is less than or equal to the test result for Field B.
- > (Is greater than). Implies that the test result for Field A is greater than the test result for Field B.
- >= (Is greater than or equal to). Implies that the test result for Field A is greater than or equal to the test result for Field B.
- Approximate (Is approximately the same as). Implies that the test result for Field A is approximately the same as the test result for Field B. An approximate test must have a threshold value. You can use this operator with numeric datatypes only.

Note: Data Validation Option compares string fields using an ASCII table.


RELATED TOPICS:
BIRT Report Examples on page 119

Threshold
A threshold is a numeric value that defines an acceptable margin of error for a test. You can enter a threshold for aggregate tests and for value tests with numeric datatypes.

An aggregate test fails if the difference between the two test results exceeds the threshold value. For example, you run a COUNT test that uses the approximate operator and set the threshold to 10. The test passes when the results are within 10 records of each other.

In a value test, the threshold defines the numeric margin of error used when comparing two values. For example, you run a VALUE test that uses the = operator. The test compares a REAL field with a value of 100.99 to an INTEGER field with a value of 101. The test passes when the threshold value is at least 0.01.

You can enter an absolute value or a percentage value as the threshold. To enter a percentage value as the threshold, suffix the number with a percent (%) sign, for example, 0.1%. You must configure the threshold if the test uses the approximate operator.

Max Bad Records


Data Validation Option lists records that do not compare successfully as bad records. You can configure a tolerance value for the acceptable number of bad records. By default, for a set or value test to pass, all records must compare successfully. You can configure an acceptable value for the maximum number of bad records. The test passes if the number of bad records does not exceed the max bad records value. You can enter an absolute value or a percentage value for the max bad records. To enter a percentage value as the max bad records, suffix the number with a percentage (%) sign. For example, 0.1%. Value and set tests display bad records on the Results tab.

Case Insensitive
String comparison in PowerCenter is case-sensitive. If you want the PowerCenter Integration Service to ignore case when you run a test that compares strings, enable the Case Insensitive option. This option is disabled by default.

Trim Trailing Spaces


By default, string comparison fails if two strings are identical except one string contains extra spaces. For example, one field value in a test is 'Data Validation' and the other field value is 'Data Validation ' (with three blank spaces after the last character). If you do not trim trailing spaces, the test produces a bad record. If you trim trailing spaces, the comparison passes because the extra spaces are ignored. You might want to trim trailing spaces when the CHAR datatype, which pads a value entered with spaces to the right out to the length of the field, is used. A field of CHAR(20) compared to CHAR(30) fails, even if both fields have the same value, unless you trim the trailing spaces. Enable the Trim Trailing Spaces option if there are spaces after the value entered in a field that should be ignored in the comparison. This option is disabled by default.


Null = Null
If null values in Field A and Field B should be considered equal, enable the Null = Null option. For example, a current employee in a table that contains employee information has a null termination date. If two records with null termination dates were compared by a database, they would not be considered equal because SQL does not consider a null value in one field to be equal to a null value in another field. Because business users often consider null values to be equal, the Null = Null option is enabled by default.

Comments
You can enter information about a test in the Comments field. Data Validation Option displays comments in the Properties window when you select the test in the Navigator or the Tests tab.

Expression Definitions
To substitute an expression for a database value in a test field, enable the Field A/B is Expression option. When you enable this option, Data Validation Option disables the Field A or Field B control in the Table Pair Test Editor dialog box and enables the expression-related fields. The expression must use valid PowerCenter expression syntax. Click Validate after you write the expression. The Data Validation Option Client does not allow you to save the table pair unless you enter a valid expression. The datatype value represents the datatype of the expression after calculation. If you do not select the correct datatype, the test might produce an error. The datatypes you can select are PowerCenter datatypes. The precision and scale must match the precision and scale used in PowerCenter for the datatype or the test might produce an error. The scale for any string or text datatype is zero. The precision and scale for a datetime datatype are 23 and 3, respectively. Note: Data Validation Option does not support user-defined, custom, lookup, or variable functions.
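For example, a minimal sketch with hypothetical column names: to compare a computed order total in Table A with a stored total in Table B, you might enable Field A is Expression and enter the following, declaring the result as a decimal with precision 10 and scale 2:
Expression A: QUANTITY * UNIT_PRICE
Field B: ORDER_TOTAL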

Expression Tips
Testing often requires the use of different expressions. PowerCenter functions are described at the end of this guide. The following examples demonstrate how to use expressions for data validation.

Concatenation, RTRIM, and SUBSTR
Data transformation often involves concatenation or the use of substring functions. The following example tests the result of a concatenation transformation:
Expression A: UPPER(first_name || ' ' || last_name)
Field B: full_name

IIF Statements
The IIF function is arguably the most popular testing function. The syntax for the IIF function is as follows:
IIF(condition, if_true_part, if_false_part)


The following example shows the IIF function used in testing:

Table A columns: sales_usa, sales_intl
Table B columns: region (either 'USA' or 'INTL'), sales

The aggregate validation can be accomplished by two tests:

Test 1: SUM
Field A: sales_usa
Expression B: IIF(region='USA', sales, 0)

Test 2: SUM
Field A: sales_intl
Expression B: IIF(region='INTL', sales, 0)

Adding Tests
You can add tests to table pairs one at a time or you can generate tests in batches. You can add any test manually. To add a test to a table pair, right-click the name of the table pair in the Navigator or Table Pairs tab, or right-click in the Tests tab, and select Add Test. The Table Pair Test Editor dialog box opens. You can generate value tests in a batch for table pairs that have tables with matching field names and datatype families. You can also generate value tests in a batch for tables or files within two target folders that have matching table or file names, field names, and datatype families.

RELATED TOPICS:
Automatically Generating Value Tests
Comparing Repository Folders

Editing Tests
To edit a test, right-click the test in the Navigator or Tests tab, and select Edit Test. You can also double-click the test name. The Table Pair Test Editor dialog box opens.

Deleting Tests
To delete a test, right-click the test in the Navigator or the Tests tab, and select Delete Test. You can also use the Delete key.


Running Tests
You can run tests from the Data Validation Option client or the command line. Use one of the following methods to run tests:
- Select one or more table pairs and click Run Tests.
- Right-click a folder in the Navigator and select Run Folder Tests.
- Right-click a test in the Tests tab and select Run Selected Tests.

Data Validation Option runs all tests for a table pair together. You can run tests individually only if one test is set up for the table pair. If you select an individual test, Data Validation Option runs all tests for the table pair. After you run a test, you can view the results on the Results tab.

Data Validation Option uses the following logic to determine whether a test passes or fails:
- An aggregate test is calculated as A <operator> B. If this relationship is true, the test passes. If the operator chosen is approximate, the test is calculated as ABS(A-B) <= Threshold.
- A value or set test passes if the number of records that do not match is less than or equal to the threshold value. If there is no threshold value and there are no records that do not match, the test passes.

When you run tests, the target repository folders must be closed in the Designer or Workflow Manager. If the target folders are open, Data Validation Option cannot write to the folders, and the tests fail.

Automatic Test Generation


You can compare two repository folders and generate all table pairs, value tests, and count tests between the tables in the two folders. You can also automatically generate tests for existing table pairs. You can generate value and count tests to compare tables during a PowerCenter upgrade or migration from development to production. You can generate tests based on the column name or column position in the tables. Data Validation Option generates the following tests:
- OUTER_VALUE test for the set of fields used for the join if the field names and field datatypes match. The OUTER_VALUE test reports any difference when join field values exist.
- VALUE test for fields in which the field names and datatypes match. The VALUE test reports any difference in actual values for each set of fields.
- COUNT_ROWS test for the set of fields used for the join if the field names and field datatypes match. A COUNT_ROWS test reports any difference in the number of rows between the tables.

Data Validation Option does not generate tests for fields when the field names or datatypes do not match. It also does not generate tests for binary fields.

Generating Table Pairs and Tests


Use the Compare Tables dialog box to generate table pairs between tables in any two folders and to generate tests associated with the table pairs.
1. In Data Validation Option, click Action > Compare Tables. The Compare Tables dialog box appears.
2. Select the repositories that contain the tables that you want to compare.


3. Select the folders that contain the tables that you want to compare.
4. Select Sources or Targets from the sub-folders.
5. Select the database connection for each folder. If there are tables in the folder that require a different database connection, modify the database connection in the table pairs and tests after auto-generation is complete.
6. If the database is IBM DB2, enter the database owner name.
7. If the data source is SAP, click Select Source Parameters and configure the SAP source parameters.
8. If the folders contain flat files, enter the path that contains the source files and target files in the Source Dir field.
9. Select whether to compare columns by name or position.
10. If you select compare columns by position, select the table and the column offset to skip in the table.
11. Select the folder to store the table pairs and tests.
12. Choose to generate count tests or count and value tests.
13. Select Sort in DB to sort the tables before Data Validation Option generates tests.
14. Choose whether to trim the trailing spaces in the tests.
15. To generate value tests for flat files, Salesforce tables, or tables without primary keys, specify the name and location of the text file that contains the primary key information for the flat files or tables. The file must contain the name of the flat file or table and the primary key separated by a comma. Each entry must be on a new line. If a table has more than one key, each key must be in a separate entry. For example:
flatfile_dictionary_10rows,FIELD1
flatfile_dictionary_10rows,FIELD2
flatfile_dictionary_10rows_str,FIELD1
16. Select whether you want to skip generation of all the tests or generate count tests if a primary key does not exist for a table. Data Validation Option generates table pairs even if you skip generation of all the tests.
17. If you have the enterprise license, you can select whether to save all the bad records.
18. If you choose to save all the bad records, select whether to save the bad records in a flat file or the Data Validation Option schema.
19. Click Create. Data Validation Option displays the summary of table pairs and tests to create and to skip.
20. Click OK. Data Validation Option generates all the possible table pairs and tests.

Generating Tests for Table Pairs


Use the Generate Tests option to generate the associated count tests and value tests for an existing table pair.
1. In the Navigator, select the table pair.
2. Right-click the table pair and select Generate Value Test.
3. Select whether to compare columns by name or position.
4. If you select compare columns by position, select the table and the column offset to skip in the table.
5. Click Yes. Data Validation Option generates the tests for the table pair.

Compare Columns by Position


You can generate tests that compare columns of the tables in a table pair by position instead of by name. The following examples describe various scenarios when you compare columns by position.

Table Pair with an SQL View and a Table
Suppose you have an SQL view, sample_view, and a table, sample1, as Table A and Table B of a table pair. The following columns exist in sample_view and sample1:

sample_view columns: SampleA.column1, SampleA.column2, SampleB.column1, SampleB.column2
sample1 columns: column1, column2

You need to compare SampleB.column1 and SampleB.column2 with column1 and column2. Select compare columns by position. Select Table A and enter 2 as the offset.

Table Pair with Tables that Have the Same Number of Columns and Different Names
Suppose you have SampleA as Table A and SampleB as Table B of a table pair. The following columns exist in SampleA and SampleB:

SampleA columns: column1, column2, column3
SampleB columns: col1, col2, col3

You need to compare column1, column2, and column3 in SampleA with col1, col2, and col3 in SampleB. Select compare columns by position. Enter 0 as the offset. You can select Table A or Table B.

Table Pair with Tables that Have a Different Number of Columns
Suppose you have SampleA as Table A and SampleB as Table B of a table pair.


The following columns exist in SampleA and SampleB:

SampleA columns: column1, column2, column3
SampleB columns: col1, col2, col3, column1, column2, column3

You need to compare column1, column2, and column3 in SampleA with column1, column2, and column3 in SampleB. Select compare columns by position. Select Table B and enter 3 as the offset.

Bad Records
Bad records are the records that fail a value or a set test. When you select a test on the Tests tab, the bad records appear on the Results tab. The columns that appear on the Results tab differ depending on the test.

Aggregate Tests
Aggregate tests do not display bad records. The Results tab displays the test result value from Field A, the test result value from Field B, and the comparison operator.

Value Tests
Value tests display the following columns for each bad record:
- The key for Table A
- The field or expression from Table A being compared
- The key for Table B
- The field or expression from Table B being compared

Set Tests
Set tests display the following columns for each bad record:
- The result from Table A
- The result from Table B

Note: If you compare two fields where one field has a value and the other is empty, Data Validation Option considers the record a bad record. If the repository is on Oracle, the database stores the empty field as NULL. A bad record with a NULL value in an Oracle repository can be either NULL or an empty field.


CHAPTER 9

Single-Table Constraints
This chapter includes the following topics:
- Single-Table Constraints Overview, 62
- Single Table Properties, 62
- Adding a Single Table, 67
- Editing Single Tables, 68
- Deleting Single Tables, 68
- Viewing Overall Test Results, 69

Single-Table Constraints Overview


Use a single-table constraint to run tests on a single table. Single-table constraints define valid data within a table. You can enforce valid values, aggregates, formats, and uniqueness. For example, you might want to verify that no annual salary in an employee table is less than $10,000. Errors in complex logic often manifest themselves in very simple ways, such as NULL values in the target. Therefore, setting aggregate, value, NOT_NULL, UNIQUE, and FORMAT constraints on a target table is a critical part of any testing plan. To run single-table constraints, you must create a single table. You can select a relational table, flat file, lookup view, SQL view, or join view as a single table.

Single Table Properties


You can view single table properties by selecting a table in either the Navigator or the Single Tables tab and viewing the properties. Most properties come from the values entered in the Single Table Editor. Other properties come from the tests set up for and run on the table. Edit single table properties in the Single Table Editor dialog box. Data Validation Option displays the Single Table Editor dialog box when you add or edit a single table. The properties vary depending on the type of object you select for the table.


The following table describes the single table properties:

Table: The table name.
Conn: Connection properties for the table.
Optimize in Database: Controls which test logic Data Validation Option converts to a PowerCenter mapping and which test logic it pushes to the database. You can select one of the following options:
- Default. Data Validation Option converts all test logic to a PowerCenter mapping and applies sorting to the data source.
- WHERE clause, Sorting, and Aggregation in DB. Data Validation Option pushes the WHERE clause, sorting logic for joins, and all aggregate tests to the database. Data Validation Option converts all other test logic to a PowerCenter mapping.
- Already Sorted Input. The PowerCenter mapping does not sort the input. If the data source is not sorted, tests may fail.
Where clause: Allows you to limit the number of records that the PowerCenter Integration Service reads from the database. Enter a valid PowerCenter Boolean expression or an SQL WHERE clause without the WHERE keyword.
Description: Single table description. By default, Data Validation Option uses the table name.
External ID: Identifier for the single table that you can use when you run Data Validation Option tests at the command line.
Primary Key Column: Primary key column or columns for the table.

Single table properties function in the same way as table pair properties.

RELATED TOPICS:
Table Pair Properties on page 41

Connection Properties
Choose the connection properties based on the data source type. You must select a connection for all the data sources except for flat files. For flat files, you must provide the source directory and the file name. Connections are PowerCenter connection objects created in the Workflow Manager.


Relational Connection Properties


Choose the relational connection properties for Microsoft SQL Server, Oracle, IBM DB2, Netezza, and PowerExchange for DB2 data sources. Configure the following properties when you select a relational data source:
Connection: PowerCenter connection object to connect to the relational data source.
Override Owner Name: Overrides the database name and schema of the source. For example, a Microsoft SQL Server table is identified by <database>.<schema>.<table>. To override both the database and the schema, enter <new database name>.<new schema name> in the text box. To change only the schema, enter <new schema name>. You cannot change only the database name.

SAS and Salesforce Connection Properties


You can use a SAS or Salesforce data source in Data Validation Option. Select the PowerCenter connection object when you select an SAS or Salesforce data source. Note: You cannot override the owner name for SAS and Salesforce data sources.

SAP Connection Properties


You must configure the SAP authentication information to use SAP data sources. Configure the following properties when you select an SAP data source:
Connection: PowerCenter connection object to connect to the SAP data source.
SAP User Name: SAP source system connection user name. Must be a user for which you have created a source system connection.
SAP Password: Password for the user name.
SAP Client: SAP client number.
SAP Language: Language you want for the mapping. Must be compatible with the PowerCenter Client code page. Data Validation Option does not authenticate the value. Ensure that you enter the correct value so that the tests run successfully.
SAP Connect String: Type A or Type B DEST entry in saprfc.ini.

SAP Data Sources in Data Validation Option


You cannot override the owner name for an SAP data source. Data Validation Option uses the stream mode for installation of the ABAP programs and cannot use the FTP mode. SAP data sources must not contain the slash (/) character in the field names.


Flat File Connection Properties


You can use the flat file data sources in the PowerCenter repository. Configure the following properties when you select a flat file data source:
Source Dir: Directory that contains the flat file. The path is relative to the machine that hosts Informatica Services.
Source File: File name with the file name extension.

Bad Records Configuration


If you have the enterprise license, you can store up to 16,000,000 bad records per test for advanced analysis and up to 1000 bad records for reporting. If you have the standard license, you can store up to 1000 bad records for reporting and analysis. For reporting, you can set the maximum number of bad records in the mapping properties in the Preferences dialog box. The default number of bad records is 100. You can set up to 1000 bad records for reporting. For advanced analysis, enter the maximum number of bad records for detailed analysis along with the file delimiter in the Detailed Error Rows Analysis section of the mapping properties. The default number of bad records is 5000. You can set up to 16,000,000 bad records.

At the table pair or single table level, you can choose to store all the bad records from the tests for the table pair or single table. Select Save all bad records for test execution in the Advanced tab when you create or edit a table pair or single table.

Data Validation Option stores bad records for the following table-pair tests:
- Value
- Outer Value
- Set

Data Validation Option stores bad records for the following single-table constraint tests:
- Value
- Unique

You must select whether you want to store the bad records in a flat file or a table in the Data Validation Option schema. If you store bad records in a flat file, you can optionally enter the name of the file. Data Validation Option appends test information to the name and retains the file extension.

Note: If you modify the file delimiter in the preferences file, run the InstallTests command with the forceInstall option for the existing table pairs or single tables that you already ran. You can also edit and save the table pair or single table from the Data Validation Option Client before you run the test. If you modify the bad records value, you need not reinstall the tests.

Bad Records in Flat File


If you configure Data Validation Option to store bad records in a flat file, it creates the flat file on the machine that runs Informatica services. The flat files that Data Validation Option generates after running the tests are stored in the following folder: <PowerCenter installation directory>\server\infa_shared\TgtFiles


You can modify the folder from the Administrator tool. Edit the $PMTargetFileDir property for the PowerCenter Integration Service.

Data Validation Option generates a folder for each table pair or single table. The name of the folder is in the following format:
TablePairName_TestRunID or SingleTableName_TestRunID

Data Validation Option creates a flat file for each test inside the folder. The name of the flat file is in the following format:
<user defined file name>_TestCaseType_TestCaseColumnA_TestCaseColumnB_TestCaseIndex.<user defined file extension>

You can get the Test Case Index from the Properties tab and the Test Run ID from the results summary of the test in the detail area. You can get the Table Pair ID or Single Table ID from the table pair or single table properties. For example, you enter the file name as BAD_ROWS.txt when you configure the table pair or single table, and you run an outer value test on the FIRSTNAME fields. The test case index is 1 and the test fields are expressions. The bad records file after you run the test has the name BAD_ROWS_OUTER_VALUE_ExprA_ExprB_1.txt.

Data Validation Option supports all the file delimiters that PowerCenter supports. When you enter non-printable characters as delimiters, enter the corresponding delimiter code in PowerCenter. When you import these files in PowerCenter, you must manually create the different data fields because the code appears in place of the delimiter in the bad records file.

Caution: Data Validation Option uses a comma as a delimiter if there are multiple primary keys. Do not use a comma as the file delimiter.

When you run the tests for the table pair or single table, Data Validation Option stores the details of the bad records in the following format:
Table Pair Name
Table A Name
Table A Connection
Table B Name
Table B Connection
Test Definition
Test Run Time
Test Run By User
Key A[], Result A[], Key B[], Result B[]

If the tests pass, Data Validation Option still creates a flat file without any bad records information.
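For illustration only, the following sketch shows what one bad records file might contain. The table names, connections, user, and key and result values are hypothetical, and the exact layout can differ in your environment:
Table Pair Name: Orders_Stage_vs_Orders_DW
Table A Name: ORDERS_STAGE
Table A Connection: STG_CONN
Table B Name: ORDERS_DW
Table B Connection: DW_CONN
Test Definition: VALUE ORDER_AMT = ORDER_AMT
Test Run Time: 2012-07-15 10:30:00
Test Run By User: dvo_user
101,250.00,101,249.95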

Bad Records in Database Schema Mode


If you choose to save detail bad records in the Data Validation Option schema, the bad records are written into the detail tables. The following table describes the tables to which Data Validation Option writes the bad records based on the type of test:
Value, Outer Value, and Value Constraint tests: table ALL_VALUE_RESULT_DETAIL with columns TEST_RUN_ID, TEST_CASE_INDEX, KEY_A, VALUE_A, KEY_B, VALUE_B
Unique Constraint tests: table ALL_UNIQUE_RESULT_DETAIL with columns TEST_RUN_ID, TEST_CASE_INDEX, VALUE
Set tests: table ALL_SET_RESULT_DETAIL with columns TEST_RUN_ID, TEST_CASE_INDEX, VALUE_A, VALUE_B

Note: You must ensure that the database table has enough table space to hold all the bad records.


The test details of all the tests that you ran in Data Validation Option are available in the TEST_CASE table. The test installation details of the tests are available in the TEST_INSTALLATION table. You can obtain the TEST_ID of a test from the TEST_INSTALLATION table. You need the TEST_ID of a test to query the complete details of a test from the TEST_CASE table. You can get the Test Case Index from the Properties tab and the Test Run ID from the results summary of the test in the detail area. You can get the Table Pair ID or Table ID from the table pair or single table properties. For example, you ran a table pair with a value test and an outer value test. The following SQL query is a sample query that retrieves the bad record information, together with the test case details, for the test run with Test Run ID 220:
select ALL_VALUE_RESULT_DETAIL.*, TEST_CASE.*
from ALL_VALUE_RESULT_DETAIL, TEST_RUN, TEST_INSTALLATION, TEST_CASE
where ALL_VALUE_RESULT_DETAIL.TEST_RUN_ID=TEST_RUN.TEST_RUN_ID
and TEST_RUN.TEST_INSTALLATION_ID=TEST_INSTALLATION.TEST_INSTALLATION_ID
and TEST_INSTALLATION.TABLE_PAIR_ID=TEST_CASE.TEST_ID
and ALL_VALUE_RESULT_DETAIL.TEST_CASE_INDEX=TEST_CASE.TEST_CASE_INDEX
and ALL_VALUE_RESULT_DETAIL.TEST_RUN_ID=220
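The same join path can be adapted for the other detail tables. For example, a sketch of a query, assuming the same relationships, that retrieves set test bad records from ALL_SET_RESULT_DETAIL for the same hypothetical test run ID 220:
select ALL_SET_RESULT_DETAIL.*, TEST_CASE.*
from ALL_SET_RESULT_DETAIL, TEST_RUN, TEST_INSTALLATION, TEST_CASE
where ALL_SET_RESULT_DETAIL.TEST_RUN_ID=TEST_RUN.TEST_RUN_ID
and TEST_RUN.TEST_INSTALLATION_ID=TEST_INSTALLATION.TEST_INSTALLATION_ID
and TEST_INSTALLATION.TABLE_PAIR_ID=TEST_CASE.TEST_ID
and ALL_SET_RESULT_DETAIL.TEST_CASE_INDEX=TEST_CASE.TEST_CASE_INDEX
and ALL_SET_RESULT_DETAIL.TEST_RUN_ID=220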

Parameterization
If you have the enterprise license, you can perform incremental data validation through parameterization. You can use parameters in the WHERE clause in table pairs and single tables. Before you use a parameter in the WHERE clause, you must enter the name of the parameter file and add the parameters in the Advanced tab of a table pair or single table definition. You must specify the datatype, scale, and precision of each parameter. After you add the parameters, you can use them in a WHERE clause to perform incremental validation. You can validate the expression with the parameters that you enter in a table pair or single table.

The parameter file must be in a location accessible to the Data Validation Option Client. The parameters in the parameter file must be in the format $$<parameter name>=value. Ensure that the parameter that you add in a table pair or single table is available in the parameter file. When you run a test, Data Validation Option looks up the value of the parameter from the parameter file and runs the WHERE clause based on that value.

If the Informatica services run on a Windows machine, you can place the parameter files in a folder on the server. Ensure that the Data Validation Option Client can access the folder. In the Data Validation Option Client, enter the parameter file location as a network path to the parameter file on the server in the following format: \\<server machine host name>\<shared folder path>\<parameter file name>

If the Informatica services run on a UNIX machine, you can place the parameter files in a folder on the server. Install DVOCmd on the server. You can configure parameterization in the Data Validation Option Client, provide the absolute path of the parameter file, and run the tests from the server with DVOCmd. Alternatively, ensure that the Data Validation Option Client can access the folder and enter the parameter file location as a network path to the parameter file on the server. Run the tests from the Data Validation Option Client.
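For example, a minimal sketch of incremental validation with a hypothetical parameter named $$LAST_RUN_DATE. The parameter file contains the following entry:
$$LAST_RUN_DATE=07/01/2012 00:00:00
The WHERE clause of the table pair or single table can then reference the parameter, for example:
UPDATE_DATE >= '$$LAST_RUN_DATE'
The parameter name, column name, and date value are illustrative only.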

Adding a Single Table


You can create a single table from the File menu or from the shortcut in the menu bar.
1. Select the folder to which you want to add the single table.
2. Click the single table shortcut on the menu bar, or click File > New > Single Table. The Single Table Editor window appears.
3. Browse and select the data source that you want to use for the single table. You can search for a data source by name or path. You can search for lookup views, SQL views, and join views only by their names.
4. Click Edit and configure the connection properties for the table.
5. Enter the WHERE clause you want to execute on the data source.
6. Enter the description for the single table.
7. Enter the external ID for the single table. You can use the external ID to execute the single table from the command line.
8. If your data source is relational, you can choose whether to execute the WHERE clause within the data source. If you choose to execute the WHERE clause within the database, the PowerCenter Integration Service passes the WHERE clause to the database for execution before data loading.
9. Select the database optimization level. The following options are available:
- Default. Data Validation Option creates a PowerCenter mapping based on the test logic and applies sorting on the data source.
- WHERE clause, sorting, and aggregation in DB. Data Validation Option pushes the WHERE clause and sorting to the database. Applicable for relational data sources.
- Already sorted input. Data Validation Option creates a PowerCenter mapping based on the test logic. If the data source is not sorted, the tests may fail.
10. Select the primary key for the table in the Key Column pane.

Editing Single Tables


To edit a single table, right-click it in the Navigator or Single Tables tab, and select Edit Single Table. You can also edit a single table by double-clicking it in the Single Tables tab. When you edit a single table, the Single Table Editor dialog box opens.

Deleting Single Tables


To delete a single table, right-click it in the Navigator or Single Tables tab, and select Delete Single Table. You can also delete a single table by selecting it and pressing the Delete key. When you delete a single table, Data Validation Option deletes all of its tests. Data Validation Option does not delete lookup or SQL views used in the single table.


Viewing Overall Test Results


When you select a single table in the Navigator, all the tests run on the single table appear in the right pane. The run summary of all the tests appears at the top of the page. Select a test to view the details of that test in the bottom pane. You can view the test properties on the Properties tab and the test results on the Results tab. You can view important test details, such as the Test Case ID, on the Properties tab along with other test details.


CHAPTER 10

Tests for Single-Table Constraints


This chapter includes the following topics:
- Tests for Single-Table Constraints Overview, 70
- Test Properties, 70
- Adding Tests, 74
- Editing Tests, 74
- Deleting Tests, 75
- Running Tests, 75
- Bad Records, 75

Tests for Single-Table Constraints Overview


Single-table constraints are tests based on a single table. Data Validation Option allows you to run an aggregate test or a VALUE test on single tables. Note that there are no set tests or OUTER_VALUE tests for single tables. However, there are some additional tests available for single tables that you cannot create for a table pair. Most single-table constraints allow you to enter a constraint value for the test. The constraint value defines the value or values to which you want to compare the values in a field. For example, you might want to verify that a SALARY field contains values greater than $10,000. Enter the minimum salary as the constraint value. Note: When you run tests, the target folder must be closed in the Designer and Workflow Manager. If the target folder is open, Data Validation Option cannot write to the folder, and the tests return an error.

Test Properties
When you select a test in the Navigator or in the Tests tab, the properties for that test appear in the Properties area. Most properties come from the values you enter in the Single Table Test Editor dialog box. Other properties apply to the most recent test run. Edit test properties in the Single Table Test Editor dialog box when you add or edit a test.


The following table describes the test properties:


Function: The test you run, such as COUNT, COUNT_DISTINCT, VALUE, or NOT_NULL.
Field: The field that contains the values you want to test.
Condition: Filter condition for the test. Enter a valid PowerCenter Boolean expression.
Operator: The operator that defines how to compare each value in the field with the constraint value.
Constraint Value: The value or values you want to compare the field values to.
Threshold: The allowable margin of error for an aggregate or value test that uses the approximate operator. You can enter an absolute value or a percentage value.
Max Bad Records: The number of records that can fail comparison for a test to pass. You can enter an absolute value or a percentage value.
Case Insensitive: Ignores case when you run a test on string data.
Trim Trailing Spaces: Ignores trailing spaces when you run a test on string data. Data Validation Option does not remove the leading spaces in the string data.
Comments: Information about a test. Data Validation Option displays the comments when you view test properties in the Properties area.
Field is Expression: Allows you to enter an expression for the field.
Datatype: The datatype for the expression if the field is an expression.
Precision: The precision for the expression if the field is an expression.
Scale: The scale for the expression if the field is an expression.

Tests
The following table describes the single table tests:
COUNT: Compares the number of non-null values for the selected field to the constraint value. This test works with any datatype.
COUNT_DISTINCT: Compares the distinct number of non-null values for the selected field to the constraint value. This test works with any datatype except binary.
COUNT_ROWS: Compares the total number of values for the selected field to the constraint value. This test counts nulls, unlike the COUNT and COUNT_DISTINCT tests. This test works with any datatype.
MIN: Compares the minimum value for the selected field to the constraint value. This test works with any datatype except binary.
MAX: Compares the maximum value for the selected field to the constraint value. This test works with any datatype except binary.
AVG: Compares the average value for the selected field to the constraint value. This test can only be used with numeric datatypes.
SUM: Compares the sum of the values for the selected field to the constraint value. This test can only be used with numeric datatypes.
VALUE: Examines the values for the field, row by row, and compares them to the constraint value. This test works with any datatype except binary.
FORMAT: Determines whether the values in the field match the pattern in the constraint value. The PowerCenter Integration Service uses the REG_MATCH function for this test. This test cannot be used with binary datatypes.
UNIQUE: Confirms that the value in the field is unique. This test does not use a constraint value. This test cannot be used with binary datatypes.
NOT_NULL: Confirms that the value in the field is not null. This test does not use a constraint value. This test cannot be used with binary datatypes.
NOT_BLANK: If the value in the field is a string value, this test confirms that the value in the field is not null or an empty string. If the value in the field is a numeric value, this test confirms that the value in the field is not null or zero. This test does not use a constraint value. This test cannot be used with datetime or binary datatypes.

Field
To create a single-table constraint, you must select the field that contains the values you want to test. The fields available in the single table appear in the Field drop-down list. Select a field from the list.

Condition
You can filter the values for the test field in a VALUE, FORMAT, NOT_NULL, or NOT_BLANK test. Data Validation Option does not test records that do not satisfy the filter condition. For example, you want to test rows in an ORDERS table only if the store ID number is not 1036. Enter STORE_ID <> 1036 in the Condition field. Data Validation Option does not check the condition syntax. Any valid PowerCenter expression, including expressions that use PowerCenter functions, is allowed. If the PowerCenter syntax is not valid, a mapping installation error occurs when you run the test. Enter the filter condition in the Condition field. Because the PowerCenter Integration Service processes the filter condition, it must use valid PowerCenter syntax. Do not include the WHERE keyword.

Operator
The operator defines how to compare the test result for the field with the constraint value. Enter an operator for aggregate, VALUE, and FORMAT tests.


The following table describes the operators available in the Operator field:
= (Equals): Implies that the test result for the field is the same as the constraint value. For example, SUM(field) is the same as the constraint value.
<> (Does not equal): Implies that the test result for the field is not the same as the constraint value.
< (Is less than): Implies that the test result for the field is less than the constraint value.
<= (Is less than or equal to): Implies that the test result for the field is less than or equal to the constraint value.
> (Is greater than): Implies that the test result for the field is greater than the constraint value.
>= (Is greater than or equal to): Implies that the test result for the field is greater than or equal to the constraint value.
~ (Is approximately the same as): Implies that the test result for the field is approximately the same as the constraint value. The approximate operator requires a threshold value. It only applies to numeric datatypes.
Between (Is between two values entered): Implies that the test result for the field is between the two constants entered as the constraint value. This operator is generally used for numeric or datetime datatypes.
Not Between (Is not between two values entered): Implies that the test result for the field is not between the two constants entered as the constraint value. This operator is generally used for numeric or datetime datatypes.
In (Is included in a list of values entered): Implies that the test result for the field is in the list of constants entered as the constraint value.
Not In (Is not included in a list of values entered): Implies that the test result for the field is not in the list of constants entered as the constraint value.

Note: Data Validation Option compares string fields using an ASCII table.

RELATED TOPICS:
BIRT Report Examples on page 119

Constraint Value
The constraint value represents a constant value to which you want to compare the field values. For example, you might want to verify that all values in the ORDER_DATE field fall between January 1, 2010 and December 31, 2010. Or, you might want to verify that the minimum ORDER_ID number is greater than 1000. The constraint value must be a string, numeric, or datetime constant. The datatype of the constraint value depends on the test.


The following table lists the constraint value datatype allowed for each test:
COUNT, COUNT_DISTINCT, COUNT_ROWS: Integer
MIN, MAX, SUM, VALUE: Same as the Field datatype.
FORMAT: String
AVG: Double
UNIQUE, NOT_NULL, NOT_BLANK: These tests do not use a constraint value.

Enter a constraint value in the Constraint Value field. Enter a constant or a list of constants separated by commas. The number of constants you enter as the constraint value depends on the operator you use:
- Arithmetic operator such as =, <>, or ~. Enter a single constant.
- Between or Not Between operator. Enter two constants separated by a comma.
- In or Not In operator. Enter multiple constants separated by commas.
Enclose each string value, datetime value, or format pattern within single quotes. Datetime values must match the PowerCenter standard datetime format of MM/DD/YYYY HH24:MI:SS.
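For example, the following sketches use hypothetical field names to illustrate the constraint value syntax for different operators:
- MIN test on ORDER_ID with the > operator: 1000
- VALUE test on ORDER_DATE with the Between operator: '01/01/2010 00:00:00','12/31/2010 00:00:00'
- VALUE test on STATUS with the In operator: 'OPEN','CLOSED','PENDING'
- FORMAT test on ZIP_CODE with the = operator: '[0-9]{5}' (a pattern that REG_MATCH can evaluate)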

Remaining Controls on Test Editor


The remaining controls on the Single Table Test Editor are used in the same manner that they are used for table pairs.

Adding Tests
To add a test to a single table, right-click the name of the table in the Navigator or Single Tables tab, or right-click in the Tests tab, and select Add Constraint Test. The Single Table Test Editor dialog box opens.

Editing Tests
To edit a test, right-click the test in the Navigator or Tests tab, and select Edit Test. You can also double-click the test name. The Single Table Test Editor dialog box opens.


Deleting Tests
To delete a test, right-click the test in the Navigator or the Tests tab, and select Delete Test. You can also use the Delete key.

Running Tests
Use one of the following methods to run tests:
- Select one or more single tables and click Run Tests.
- Right-click a folder in the Navigator and select Run Folder Tests.
- Right-click a test in the Tests tab and select Run Selected Tests.

Data Validation Option runs all tests for a single table together. You cannot run tests individually unless only one test is set up for the table. If you select an individual test, Data Validation Option runs all tests for the single table. After you run a test, you can view the results on the Results tab.

Data Validation Option uses the following logic to determine whether a test passes or fails:
- An aggregate test is calculated as value <operator> constraint. If this relationship is true, the test passes. If the operator chosen is approximate, the test is calculated as ABS(value - constraint) <= Threshold.
- A VALUE test passes if the number of records that fail comparison is less than or equal to the threshold value. If there is no threshold value and there are no records that fail comparison, the test passes.
- A FORMAT test is calculated as value <operator> constraint. If this relationship is true, the test passes.
- A UNIQUE, NOT_NULL, or NOT_BLANK test passes if the field value is unique, is not null, or is not blank, respectively. For string values, not blank means the string is not null or empty. For numeric values, not blank means the number is not null or 0.

When you run tests, the target repository folders must be closed in the Designer or Workflow Manager. If the target folders are open, Data Validation Option cannot write to the folders, and the tests fail.

Bad Records
When you select a test on the Tests tab, the records that fail the test appear on the Results tab. Different columns appear on the Results tab depending on the test.

Aggregate Tests
Aggregate tests do not display bad records. The Results tab displays the test result value.

VALUE, FORMAT, NOT_NULL, and NOT_BLANK Tests
These tests display the following columns for each bad record:
- The key or expression for the field
- The field value

UNIQUE Tests
UNIQUE tests display the non-unique field values.


CHAPTER 11

SQL Views
This chapter includes the following topics:
- SQL Views Overview, 76
- SQL View Properties, 76
- Adding SQL Views, 78
- Editing SQL Views, 78
- Deleting SQL Views, 78

SQL Views Overview


SQL views facilitate the use of more complex functionality for single tables and table pairs. An SQL view allows you to use several tables and several calculations in a query to produce a set of fields that you can use as a table in a single table or table pair. This functionality is similar to the SQL override in PowerCenter or a view in a relational database. You can use any valid SQL statement to create an SQL view.

SQL View Properties


You can view SQL view properties by selecting an SQL view in either the Navigator or the SQL Views tab and viewing the properties. Most properties come from the values entered in the SQL View Editor. Other properties come from the tests set up for and run on the SQL view. Edit SQL view properties in the SQL View Editor dialog box when you add or edit an SQL view.

The following table describes the SQL view properties:

Description: SQL view description.
Table Definitions: Tables used to create the SQL view. If you identify a table with an alias, enter the alias name with the table name. All tables you use in the SQL view must exist in the same database.
Connection: PowerCenter connection for the tables.
Column Definition: The columns that make up the SQL view. Data Validation Option imports all columns from the tables you select. You can create, delete, and rearrange columns.
SQL Statement: SQL statement that you run against the database to retrieve data for the SQL view.
Comment: Information about the SQL view. Data Validation Option displays the comment when you view the SQL view in the Properties area.

Description
Enter a description so you can identify the SQL view. Data Validation Option displays the description in the Navigator and on the SQL Views tab. The description can include spaces and symbols.

Table Definitions and Connection


To provide Data Validation Option with the information it needs to create an SQL view, you must specify the tables that the SQL statement is based on and the corresponding database connection. When you provide the tables and connection information, Data Validation Option can access the metadata that is necessary for the view to function correctly. To add a table, click Add Table. The Choose Data Source dialog box opens. This dialog box displays all of the relational tables available in the repositories. You can sort information in this dialog box by clicking the column headers. You can reduce the number of items to select by typing one or more letters of the table, file, or view name in the Search field. Select a table and click Select. All of the tables you use in an SQL view must exist in the same database. Note: You cannot create an SQL view with self-join. Use a join view to create self-joins. If you identify the table with an alias in the SQL statement you use to create the view, enter the alias name next to the table name. When you finish adding tables, select the PowerCenter connection for the tables from the Connection list.

Column Definition
After you specify the tables on which the SQL view is based, you must specify the columns that make up the view. The number of columns you define for the view must match the SQL statement. To import the columns from the tables you select, click Populate. Data Validation Option imports all columns in the tables. Delete the columns that you do not want to use. You can rearrange the columns in the view. You can also create a column. To do this, open a column field in the Column and Expression Definition list and select Column. Enter the column name in the SQL View Column Editor dialog box. You must also specify the datatype, precision, and scale for the column. The datatype, precision, and scale information must match the transformation datatype that Informatica uses for the column. For datetime, string, and integer datatypes, the scale must be zero.

SQL Statement
Enter an SQL statement to retrieve data for the SQL view.


The statement that you enter runs as a query against the database, so it must use valid database syntax. Also, the columns that you enter in the SELECT statement must match the columns in the Column Definition list in number, position, and datatype. To avoid errors when you run tests, test the SQL statement in the database before you paste it into the SQL Statement field. Data Validation Option does not check the SQL statement syntax. You can call a stored procedure from an SQL view. The connection that you specify in the SQL view for the source must have permission on the stored procedure.
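For example, a minimal sketch of an SQL view statement that uses hypothetical table and column names. The view is based on the ORDERS and CUSTOMERS tables in the same database, and the column definition for the view would list CUSTOMER_NAME, ORDER_COUNT, and TOTAL_AMOUNT in that order with matching datatypes:
select c.CUSTOMER_NAME, count(o.ORDER_ID) as ORDER_COUNT, sum(o.AMOUNT) as TOTAL_AMOUNT
from CUSTOMERS c, ORDERS o
where c.CUSTOMER_ID = o.CUSTOMER_ID
group by c.CUSTOMER_NAME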

Comment
You can associate a comment with the view. Data Validation Option displays the comment when you view the SQL view in the Properties area.

Adding SQL Views


To create an SQL view, right-click SQL Views in the Navigator or right-click in the SQL Views tab, and select Add SQL View. The SQL View Editor dialog box opens.

Editing SQL Views


To edit an SQL view, right-click the SQL view in the Navigator or SQL Views tab, and select Edit SQL View. The SQL View Editor dialog box opens.

Deleting SQL Views


To delete an SQL view, right-click the SQL view in the Navigator or SQL Views tab, and select Delete SQL View. You can also select the SQL view and press the Delete key. When you delete an SQL view, Data Validation Option deletes all table pairs, single tables, and tests that use the SQL view.


CHAPTER 12

Lookup Views
This chapter includes the following topics:
- Lookup Views Overview, 79
- Lookup View Properties, 80
- Adding Lookup Views, 81
- Editing Lookup Views, 81
- Deleting Lookup Views, 82
- Lookup Views Example, 82
- Joining Flat Files or Heterogeneous Tables using a Lookup View, 83

Lookup Views Overview


Data Validation Option lookup views allow you to test the validity of the lookup logic in your transformation layer. Lookup views allow you to validate the process of looking up a primary key value in a lookup, or reference, table using a text value from a source, and then storing the lookup table primary key in the target fact table. For example, a product name in the source system might be in a dimension that serves as the lookup table. The data transformation process involves looking up the product name and placing the primary key from the lookup table in the target fact table as a foreign key. You must validate the product name in the source table against the foreign key in the target table. The following table lists the keys used in the example:
Source Table columns: source_id, product_name
Lookup Table columns: lookup_id, product_name
Target Table columns: target_id, source_id, lookup_id

The source table product name field is found in the lookup table. After the product name is found, the primary key from the lookup table is stored in the target table as a foreign key. You must validate the product name in the source table against the foreign key in the target table.

To test the validity of the lookup table foreign key in the target table, complete the following tasks:
1. Create the lookup view. Add the source table and the lookup table to the lookup view. Then create a relationship between the product name in the source and lookup tables.
2. Create a table pair with the lookup view and the table that is the target of the data transformation process. Join the tables on the source table primary key, which is stored in the target table as a foreign key.
3. Create an OUTER_VALUE test that compares the primary key of the lookup table to the lookup ID that is stored as a foreign key in the target table.

The OUTER_VALUE test checks the validity of the lookup table primary key stored in the target table against the contents of the source table. The test also finds any orphans, which are records in the target table that do not match any records in the lookup table.

Lookup View Properties


Lookup view properties describe the properties of source and lookup tables. You can view lookup view properties by selecting a lookup view in either the Navigator or the Lookup Views tab and viewing the properties. Most properties come from the values entered in the Lookup View Editor dialog box. Other properties come from the tests set up for and run on the lookup view. Edit lookup view properties in the Lookup View Editor dialog box when you add or edit a lookup view. The following table describes the lookup view properties:
Source Table: Source table name.
Source Conn: PowerCenter connection for the source table.
Override Owner Name: Overrides the schema or owner name for the source table.
Source Dir: Source file directory if the source table is a flat file. The path is relative to the machine that hosts Informatica Services.
Source File: File name, including file extension, if the source table is a flat file.
Lookup Table: Lookup table name.
Lookup Conn: PowerCenter connection for the lookup table.
Lookup Source Dir: Source file directory if the lookup table is a flat file. The path is relative to the machine that hosts Informatica Services.
Lookup Source File: File name, including file extension, if the lookup table is a flat file.
Description: Lookup view description.
Source to Lookup Relationship: The fields on which the source table and lookup table are joined.

Selecting Source and Lookup Tables


A lookup view consists of a source table and a lookup table. Use the Browse button and the Select Data Sources dialog box to select the source and lookup table in the same way that you select tables for table pairs.


Selecting Connections
Select the correct connections for the source and lookup tables in the same way that you select connections for table pairs.

Overriding Owner Name


You can override the owner name for the source table, but not for the lookup table. To specify a different owner or schema name for the lookup table, create a connection in the Workflow Manager, and use that connection for the lookup table.

Source Directory and File


If either the source or lookup tables are flat files, specify the source directory and file name plus file extension, in the same way that you specify source directories and file names for table pairs.

Description
Data Validation Option automatically generates the description for a lookup view based on the tables you select. You can change the description.

Source to Lookup Relationship


In the source and lookup tables, select the values you want to look up in the lookup table.

Adding Lookup Views


To create a lookup view, right-click Lookup Views in the Navigator or right-click in the Lookup Views tab, and select Add Lookup View. The Lookup View Editor dialog box opens. Select the source table and lookup table, and create the lookup relationship between them. That is, select the field to look up in the lookup table. You cannot use expressions for lookup view join fields. The lookup view you create includes fields from both the source and the lookup table, joined on the lookup relationship fields. Data Validation Option precedes the source table field names with "S_." You can use the lookup view to validate data in the target table, where the lookup table primary key is stored as a foreign key.

Editing Lookup Views


To edit a lookup view, right-click the lookup view in the Navigator or Lookup Views tab, and select Edit Lookup View. The Lookup View Editor dialog box opens. You cannot modify the sources in a lookup view. You can modify the lookup relationship.


Deleting Lookup Views


To delete a lookup view, right-click the lookup view in the Navigator or Lookup Views tab, and select Delete Lookup View. You can also select the lookup view and press the Delete key. When you delete a lookup view, Data Validation Option deletes all table pairs and tests that use the lookup view.

Lookup Views Example


Use a lookup view to test the validity of the foreign key stored in the target, or fact, table and to confirm that there are no orphans. The following tables display sample data that is typical of data used to build a target table.

Source Table
ORDER_ID  PRODUCT_NAME  AMOUNT
101       iPod          100
102       Laptop        500
103       iPod          120

Product Lookup Table
LKP_PRODUCT_ID  LKP_PRODUCT_NAME
21              iPod
22              Laptop

Target Table
TARGET_ID  ORDER_ID  LKP_PRODUCT_ID  AMOUNT
1          101       21              100
2          102       22              500
3          103       21              120

To test the validity of the lookup table foreign key in the target table, perform the following steps:

Create the lookup view.
Create a lookup view with the source and lookup tables. The lookup relationship uses the product name fields in both the source and the lookup tables. The lookup view now includes the following fields:
S_ORDER_ID
S_PRODUCT_NAME
S_AMOUNT
LKP_PRODUCT_ID
LKP_PRODUCT_NAME
Note that the fields that originate in the source have "S_" as a prefix.

Create the table pair.
Create a table pair using the lookup view and the target table. Create a join relationship between the source table primary key and the same field stored in the target table as a foreign key as follows:
S_ORDER_ID and ORDER_ID

Create an OUTER_VALUE test.
Create an OUTER_VALUE test that compares LKP_PRODUCT_ID in the lookup view and the target table as follows:
LKP_PRODUCT_ID and LKP_PRODUCT_ID

Joining Flat Files or Heterogeneous Tables using a Lookup View


One disadvantage of the SQL view is that it does not allow the use of flat files or heterogeneous database tables. You can join two heterogeneous sources with a lookup view. You can think of the source to lookup relationship as an inner join between the two tables or files.


CHAPTER 13

Join Views
This chapter includes the following topics:
- Join Views Overview, 84
- Join View Data Sources, 84
- Join View Properties, 85
- Adding a Join View, 88
- Join View Example, 90

Join Views Overview


A join view is a virtual table that contains columns from related heterogeneous data sources joined by key columns. Use a join view to run tests on several related columns across different tables. You can create a join view instead of multiple SQL views with joins. For example, the Employee table has employee details, the Inventory table has sales details, and the Customer table has customer details. If you create a join view with these tables, you can obtain a consolidated view of the inventory sold by the partner and the revenue generated by the employees associated with the partners. You can run tests with the join view to validate data across the tables. You can create a join view with different types of data sources. For example, you can create a join view with a flat file, an SAP table, and an Oracle table. You can add a join view to a single table or a table pair. You can then create tests with the table pair or single table to validate the data in the join view. Add multiple data sources to the join view and add join conditions to define the relationship between data sources.
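Conceptually, and only as an illustration with hypothetical table and column names, a join view of the three tables behaves like the following SQL join, except that the sources can be heterogeneous (for example, a flat file joined to an Oracle table):
select e.EMPLOYEE_NAME, c.CUSTOMER_NAME, i.SALE_AMOUNT
from EMPLOYEE e, INVENTORY i, CUSTOMER c
where e.EMPLOYEE_ID = i.EMPLOYEE_ID
and i.CUSTOMER_ID = c.CUSTOMER_ID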

Join View Data Sources


You can create a join view with tables from multiple data sources such as relational, application, and flat files. You can use the following sources when you create a join view:
- Oracle
- IBM DB2
- Microsoft SQL Server
- Netezza
- PowerExchange for DB2/zOS
- Flat files
- SAP
- SAS
- Salesforce.com

Join View Properties


Join view properties include table definitions and join conditions for each data source. The following table describes the join view properties:

Description: Description of the join view.
Order: Sequence of data sources in the join view. You can join a data source with any of the preceding data sources.
Table Name: Data source that you add in the join view.
Alias: Alias name for the table.
Join type: Type of join used to join the data sources.
Where Clause: The WHERE clause to filter rows in the data source.
Left field: Left field of the join condition. The left field is the key column of the data source to which you create the join.
Right field: Right field of the join condition. The right field is the key column of the data source for which you create the join.

Connection Properties
Choose the connection properties based on the data source type. You must select a connection for all the data sources except for flat files. For flat files, you must provide the source directory and the file name. Connections are PowerCenter connection objects created in the Workflow Manager.


Relational Connection Properties


Choose the relational connection properties for Microsoft SQL Server, Oracle, IBM DB2, Netezza, and PowerExchange for DB2 data sources. Configure the following properties when you select a relational data source:
Connection: PowerCenter connection object to connect to the relational data source.
Override Owner Name: Override the database name and schema of the source. For example, a Microsoft SQL Server table is identified by <database>.<schema>.<table>. To override the database and the schema, enter <new database name>.<new schema name> in the text box. To change only the schema, enter <new schema name> in the text box. You cannot change only the database name.

SAS and Salesforce Connection Properties


You can use a SAS or Salesforce data source in Data Validation Option. Select the PowerCenter connection object when you select an SAS or Salesforce data source. Note: You cannot override the owner name for SAS and Salesforce data sources.

SAP Connection Properties


You must configure the SAP authentication information to use SAP data sources. Configure the following properties when you select an SAP data source:

Connection: PowerCenter connection object to connect to the SAP data source.
SAP User Name: SAP source system connection user name. Must be a user for which you have created a source system connection.
SAP Password: Password for the user name.
SAP Client: SAP client number.
SAP Language: Language you want for the mapping. Must be compatible with the PowerCenter Client code page. Data Validation Option does not authenticate the value. Ensure that you enter the correct value so that the tests run successfully.
SAP Connect String: Type A or Type B DEST entry in saprfc.ini.

SAP Data Sources in Data Validation Option


You cannot override the owner name for an SAP data source. Data Validation Option uses the stream mode for installation of the ABAP programs and cannot use the FTP mode. SAP data sources must not contain the slash (/) character in the field names.


Flat File Connection Properties


You can use flat file data sources defined in the PowerCenter repository. Configure the following properties when you select a flat file data source:

Source Dir: Directory that contains the flat file. The path is relative to the machine that hosts Informatica Services.
Source File: File name with the file name extension.

Database Optimization in a Join View


You can optimize the data sources in a join view for better performance. You can choose to select a subset of data in the data source and aggregate the rows for a relational data source.

To improve read performance of the join view, you can provide a WHERE clause. The WHERE clause ensures that the data source uses a subset of data that satisfies the condition specified in the WHERE clause. Data Validation Option does not check the WHERE clause syntax. If the PowerCenter Integration Service executes the WHERE clause, any valid PowerCenter expression, including expressions that use PowerCenter functions, is allowed. If the PowerCenter syntax is not valid, a mapping installation error occurs. Use a PowerCenter expression when you do not push down the WHERE clause into the data source.

Use the following guidelines if the data source executes the WHERE clause (example filter conditions appear after the optimization levels below):
- Relational data source. The WHERE clause must be a valid SQL statement. If the SQL statement is not valid, a runtime error occurs.
- SAP data source. The WHERE clause must be a valid SAP filter condition in the ERP source qualifier.
- Salesforce data source. The WHERE clause must be a valid SOQL filter condition.
- SAS data source. The WHERE clause must be a valid Where clause Overrule condition in the SAS source qualifier.

You can choose one of the following optimization levels when you configure a data source in a join view:
- Default. Data Validation Option converts all test logic to a PowerCenter mapping and applies sorting to the data source.
- WHERE clause, Sorting, and Aggregation in DB. Data Validation Option pushes the WHERE clause, sorting logic for joins, and all aggregate tests to the database. Data Validation Option converts all other test logic to a PowerCenter mapping. You can choose this option with relational data sources.
- Already Sorted Input. The PowerCenter mapping does not sort the input. Ensure that you sort the data so that the tests run successfully.
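For example, a relational data source might be filtered with a SQL WHERE clause such as the following sketch, where the AMOUNT and ORDER_DATE columns are hypothetical and only illustrate the syntax:

AMOUNT > 100 AND ORDER_DATE >= TO_DATE('01/01/2012', 'MM/DD/YYYY')

If you do not push the clause down to the database, a comparable condition written as a PowerCenter expression, again with hypothetical column names and standard PowerCenter functions, might look like the following:

IIF(ISNULL(AMOUNT), 0, AMOUNT) > 100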

Join Types
You can choose join types to link fields in different data sources. You can create the following types of joins between two tables in a join view:

Inner join
An inner join creates a result table by combining column values in two tables A and B based on the join condition. Data Validation Option compares each row in A with each row in B to find all pairs of rows that satisfy the join condition.

Left outer join
A left outer join between tables A and B contains all records of the left table A, even if the join condition does not find any matching record in the right table B. The resulting join contains all rows from table A and the rows from table B that match the join condition.

Right outer join
A right outer join between tables A and B contains all records of the right table B, even if the join condition does not find any matching record in the left table A. The resulting join contains all rows from table B and the rows from table A that match the join condition.

Full outer join
A full outer join contains all the rows from tables A and B. The resulting join has null values for every column in the tables that does not have a matching row.

Alias in Join View


The alias of a data source in a join view helps you identify data sources that share the same name. By default, Data Validation Option assigns the data source name as the alias name. If you select a data source with the same name as any other data source in the join view, Data Validation Option appends the alias name with a number. Note: If you edit the alias name after you create a join view, Data Validation Option deletes the join conditions. Create the join conditions again to save the join view.

Join Conditions
You must configure join conditions for all the data sources in the join view. When you configure a join condition, select the data source in the table definition list and specify the left and right fields of the join condition. Data Validation Option displays the alias names of the data sources. The left field of the join condition consists of the output fields from the alias name you choose. The right field consists of the output fields of the data source for which you configure the join condition. You can create a join condition with any of the data sources in the previous rows in a join view. You can create multiple join conditions for the same data source. You cannot save a join view unless you create at least one valid join condition for all the data sources.

Adding a Join View


You must configure all the table definitions and join conditions when you configure a join view.
1. Click File > New > Join View.
   The Join View Editor dialog box appears.
2. Enter a description for the join view.
3. Click Add in the Table Definitions pane.
   The Choose Data Source dialog box appears.
4. Select the data source.
5. Configure the table definition for the data source.
6. Optionally, click Output Fields and select the fields that you want to view when you create the join condition and when you configure the tests.
7. Configure multiple table definitions as required.
   You can change the sequence of data sources in the join view and delete table definitions. When you make any change to the data sources in the join view, you must recreate the join conditions.
8. Configure join conditions for all the data sources.
   You can join a data source with any of the preceding data sources in the Table Definitions pane. You need not specify the join condition for the first data source.

Configuring a Table Definition


Configure a table definition after you add a data source to the join view in the Join View Editor dialog box.
1. Select the data source in the Join View Editor dialog box.
2. Click Edit in the Table Definitions pane.
   The Edit Table dialog box appears.
3. Enter an alias name for the data source. By default, the data source name appears as the alias name.
4. Select the join type for the data source. Select join types for all the data sources except the first data source in the join view.
5. Configure the connection details for the table. The connection details vary depending on the data source type.
6. Optionally, enter the WHERE clause. Data Validation Option runs the WHERE clause when it fetches data from the table.
7. If the table is relational, you can choose to push down the WHERE clause to the database.
8. Select the database optimization level.
9. Click OK.

Configuring a Join Condition


Add join conditions to specify the relationship between data sources. You can create join conditions for a data source with any other data source in the previous rows.
1. Select the data source in the Join View Editor dialog box.
2. Click Add in the Join Conditions pane.
   The Join Condition dialog box appears.
3. Select the alias name of any of the data sources in the previous rows.
4. Select the left field of the join condition.
   The left field of the join condition consists of the output fields from the alias name you choose. You can also configure and validate a PowerCenter expression as the field. When you enter a field name in the expression, prefix the field name with the alias name followed by an underscore. For example, if the alias name of the table is customer1 and you want to use the CustID field in an expression, enter the expression as customer1_CustID > 100.
5. Select the right field of the join condition.
   The right field consists of the output fields of the data source for which you configure the join condition. You can also configure and validate a PowerCenter expression as the field.
6. Click OK.


Managing Join Views


You can edit and delete join views that you create in Data Validation Option.
1. Click Join Views in the Navigator.
   Join views appear in the right pane.
2. Edit or delete the join view.
   If you modify the join view, re-create the join conditions in the join view. If you delete a join view, you must recreate the table pairs or single tables that contain the join view.

Join View Example


You need to validate the inventory sales made by employees and partners against the annual sales report. The Account table in an SAP system holds the information for an employee account. The Partner table is a Salesforce table that contains information about the inventory sold to a partner associated with an employee. Inventory is a flat file that contains the details of the inventory sold. Account_History is an Oracle table that contains the history of activities for the account. The current requirement is to validate data across the tables based on the inventory sales of an account. You also need to validate the account details against the historic account details to check for discrepancies. You can create a join view with the tables so that you can run single-table tests to validate data.

Tables and Fields


The following table lists the join view tables and their columns:
Account (SAP)
The Account table contains the following columns:
- Account ID
- Account Name
- Collection
- Inventory

Partner (Salesforce)
The Partner table contains the following columns:
- Partner ID
- Partner Name
- Inventory
- Cost
- Associated Account ID

Inventory (Flat file)
The Inventory table contains the following columns:
- Inventory ID
- Quantity
- Associated Partner ID
- Associated Account ID

Account_History (Oracle)
The Account_History table contains the following columns:
- Historic Account ID
- Account Name
- Total Inventory
- Total Collection

Creating the Join View


1. Enter Account_Cumulative as the description.
2. Add Account as the first table in the join view.
3. Add the Partner, Inventory, and Account_History tables in that order.
4. Configure the table definitions with the required join types.
5. Create join conditions for Partner, Inventory, and Account_History.

Table Definition Configuration


The following list describes the tables and their join types when you configure the table definitions:

Partner
You want to capture the details of partners associated with each account. Configure an inner join for the Partner table so that Data Validation Option adds the details of the partners for which there are corresponding accounts to the join view.

Inventory
You want to capture the details of the inventory sold by the partners. Configure an inner join for the Inventory table so that Data Validation Option adds the details of the inventory for which there are corresponding partners to the join view.

Account_History
You want to capture the historic details of an account. Configure a left outer join for the Account_History table so that Data Validation Option adds all the historic account details to the join view.

Adding Join Conditions


Configure the following join conditions for the tables:

Partner
Select Account as the join table. Select the Account ID output field from the Account table as the left field and the Associated Account ID output field from the Partner table as the right field of the join.

Inventory
Select Partner as the join table. Select the Partner ID output field from the Partner table as the left field and the Associated Partner ID output field from the Inventory table as the right field of the join.

Account_History
Select Account as the join table. Select the Account ID output field from the Account table as the left field and the Historic Account ID output field from the Account_History table as the right field of the join.


The following figure illustrates the formation of the join view with the table relationship:

After you create the join view, create a single table with the join view. Generate and run tests on the single table to validate the data in the join view.

Removing Table from the Join View


After you run the required tests for the join view with all the tables, you may want to remove the partner information and run tests solely for the account. If you remove the table Partner from the join view, the join condition for the table Inventory is no longer valid. You need to create another join condition for the table Inventory to save the join view. The following figure illustrates the broken join view:

Add a join condition to the table Inventory with Account as the join table. Join Account ID field in the Account table with Associated Account ID field in the Inventory table. The following figure illustrates the join view without the Partner table:


CHAPTER 14

Reports
This chapter includes the following topics:
Reports Overview, 93
Business Intelligence and Reporting Tools (BIRT) Reports, 93
Jasper Reports, 95
Jasper Report Types, 99
Dashboards, 101
Metadata Manager Integration, 102

Reports Overview
Data Validation Option stores all test definitions and test results in the Data Validation Option repository. You can run reports to display test definitions and results. You can use the BIRT reporting engine to generate the reports. If you have the enterprise license, you can use JasperReports Server to generate the reports. If you use the JasperReports Server, you can also generate dashboards at different levels. You can also view the metadata properties of the data source in a test if you configure Metadata Manager integration for the Data Validation Option repository.

Business Intelligence and Reporting Tools (BIRT) Reports


You can use the BIRT reporting engine available in Data Validation Option to generate reports. You can generate a report for one or more table pairs or single tables. You can generate the following BIRT reports in Data Validation Option:

Summary of Testing Activities
The Summary of Testing Activities report displays the number of table pairs or single tables, the number of tests for each table pair or single table, and the overall test results.

Table Pair Summary
The Table Pair Summary report lists each table pair or single table with the associated tests. Data Validation Option displays each table pair or single table on a separate page. The report includes a brief description of each test and result.

Detailed Test Results
The Detailed Test Results report displays each test on a separate page with a detailed description of the test definition and results. If one of the test sources is a SQL or a lookup view, the report also displays the view definition.

Note: Report generation can take several minutes, especially when the report you generate contains hundreds of tests or test runs.

BIRT Report Generation


You can generate a report for one or more table pairs or single tables. You can also generate a report for all table pairs and single tables in a folder or a test. Right-click the objects for which you want to generate a report, and select Generate Report. The Report Parameters dialog box appears. The following table describes the report options:
User: User that created the tests. By default, Data Validation Option generates a report for the tests that the current user creates and runs. Select All to display tests created and run by all users. Select a user name to display tests created and run by that user.
Table Pair: Table pairs or single tables for which you want to run a report. You can generate a report on all table pairs and single tables, on the table pairs and single tables in a folder, or on a test. Data Validation Option gives you different options depending on what you select in the Navigator or details area. For example, you can generate a report on all table pairs and single tables or on the table pairs and single tables in the folder.
Recency: Test runs for which you want to run a report. You can select the latest test runs or all test runs.
Result Type: Test results for which you want to run a report. You can select all results, tests that pass, or tests that do not pass.
Run Dates: The test run dates. Enter the from date, to date, or both in the format MM/DD/YYYY.
Report Subtitle: Data Validation Option displays the subtitle on each page of the report.

Note: If you change the Data Validation Option repository after you configure BIRT reports, you must restart Data Validation Option Client to generate accurate reports.

SQL and Lookup View Definitions


If a test contains either an SQL view or a lookup view as a source, Data Validation Option prints the view definition as part of the report. You cannot print a report showing the definition of the view by itself because each view is tied to a specific test definition. For example, you create an SQL view, use it in a table pair, and run a test. You then update the view by changing the SQL statement, and re-run the test. Each test result is based on a different view definition, which is why the view definition must be tied to a specific result.


Custom Reports
You can write custom reports against database views in the Data Validation Option schema. All Data Validation Option reports run against database views that are set up as part of the installation process. You can write custom reports based on the database views. Do not write reports against the underlying database tables because the Data Validation Option repository metadata can change between versions.

RELATED TOPICS:
Reporting Views Overview on page 129

Viewing Reports
Data Validation Option displays reports in a browser window. Use the arrows in the upper right corner to scroll through the pages of the report. To display a specific page, enter the page number in the Go To Page field. You can display or hide the table of contents. To do this, click the table of contents icon in the upper left corner of the report. When you display the table of contents, you can click a heading to display that section of the report. You can also print a report or export it to a PDF file. Click the print icon in the upper left corner of the report.

Jasper Reports
You can generate and view reports in the JasperReports Server if you have the enterprise license. You can use the JasperReports Server available with Informatica Services or a standalone JasperReports Server. If you use the JasperReports Server bundled with Informatica Services, you can use single sign-on when you view the Jasper reports. You can generate reports at the following levels:
Table pairs and single tables
Folders
Data Validation Option user
PowerCenter repository
Views

When you generate reports, you can also select multiple table pairs and single tables, or folders. If you have a large test data set, use the report annotations to view the report details. Data Validation Option provides the following administrative reports:
External IDs Used In Table Pairs
Views Used in Table Pairs/Tables
Sources Used In Table Pairs/Tables/Views


The following table lists all the Jasper reports and the corresponding levels of reporting:
Run Summary: User, Folder, Table Pair/Single Table
Table Pair Summary: User, Folder, Table Pair/Single Table
Detailed Test Results: User, Folder, Table Pair/Single Table
Table Pair Run Summary: Table Pair/Single Table
Last Run Summary: User, Folder, Table Pair/Single Table
Percentage of Bad Rows: User, Folder, Table Pair/Single Table
Percentage of Tests Passed: User, Folder, Table Pair/Single Table
Tests Run Vs Tests Passed: User, Folder, Table Pair/Single Table
Total Rows Vs Percentage of Bad Rows: User, Folder, Table Pair/Single Table
Bad Rows: Folder, Table Pair/Single Table
Most Recent Failed Runs: User, Folder
Failed Runs: Folder
Failed Tests: Folder
Validation Failures: User, Folder
External IDs Used In Table Pairs: User, Folder, Table Pair/Single Table
Views Used in Table Pairs/Tables: Views
Sources Used In Table Pairs/Tables/Views: PowerCenter Repository

You can export Jasper reports in the following formats:


PDF
DOC
XLS
XLSX
CSV
ODS
ODT


Status in Jasper Reports


Reports display the status of a test, table pair, or single table based on the report type. The following statuses are available in Jasper reports for tests:
- Pass. The test has passed.
- Fail. The test has failed.
- Error. The test encountered a run error or the test has no result.

The following statuses are available in Jasper reports for table pairs and single tables:
- Pass. Pass status for a table pair or single table can occur in the following scenarios:
  - All the tests have passed.
  - At least one test has pass status and the rest of the tests have no results.
- Fail. If one of the tests in a table pair or single table fails, reports display the status as fail.
- Error. If all the tests in a table pair or single table have an error or no result, reports display the status as error.

Tests and table pairs or single tables with error status do not appear in the bar charts.

Configuring Jaspersoft Reporting


Configure the Jasper Reporting Service settings before you generate a report.
1. Click File > Settings > Preferences.
   The Preferences dialog box appears.
2. Click Jaspersoft Reports.
3. Select Enable Jaspersoft Reporting.
4. Enter the JasperReports Server host name.
5. Enter the JasperReports Server port.
6. Enter the Jaspersoft web app name.
   If you want to use a standalone JasperReports Server, enter the Jaspersoft web app name based on the JasperReports Server configuration. If you want to use the JasperReports Server available with Informatica Services, enter the Jaspersoft web app name as ReportingandDashboardsService.
7. Enter the folder in the JasperReports Server to store your reports.
8. Click Test to validate the settings.
9. Click Configure.
   Data Validation Option copies the jrxml files from the <Data Validation Option installation directory>\DVO\jasper_reports folder to the JasperReports Server.
10. Enter the login details for the JasperReports Server if you use a standalone JasperReports Server, or the login details of the Informatica Administrator if you use the JasperReports Server available with Informatica Services.
    The user must have administrative privileges in the JasperReports Server.
11. Click OK.
    Data Validation Option creates the root folder in the Jaspersoft server.
12. Click Save to save the settings.

Note: If you change the Data Validation Option repository after you configure Jaspersoft Reporting, click Configure before you generate new reports.


Generating a Report
You can generate a report at the table pair, folder, repository, or user level.
1. Select the object for which you want to generate a report. You can select multiple objects of the same type.
2. Click Action > Generate Report.
   The Report Parameters dialog box appears. You can also right-click the object and select Generate Report if you want to generate a report at the table pair or folder level.
3. Select the report type. The following table displays the report options that you can select for each report:
Run Summary: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Table Pair Summary: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Detailed Test Results: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Table Pair Run Summary: User - N/A; Recency - N/A; Run Date - Last 24 hours/Last 30 days/Custom date range
Last Run Summary: User - All Users/Current User/Any User; Recency - N/A; Run Date - Last 24 hours/Last 30 days/Custom date range
Percentage of Bad Rows: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Percentage of Tests Passed: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Tests Run Vs Tests Passed: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Total Rows Vs Percentage of Bad Rows: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Bad Rows: User - N/A; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Most Recent Failed Runs: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Failed Runs: User - N/A; Recency - N/A; Run Date - Last 24 hours/Last 30 days/Custom date range
Failed Tests: User - N/A; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Validation Failures: User - All Users/Current User/Any User; Recency - All/Latest; Run Date - Last 24 hours/Last 30 days/Custom date range
Table Pairs/Tables with External ID: User - N/A; Recency - N/A; Run Date - N/A
Note: You can select the user only if you generate the report from the user level.
4. Optionally, enter the report subtitle.
   You can use the report subtitle to identify a specific report.
5. Click Run.
   The report appears in the browser.

Jasper Report Types


If you have the enterprise license, you can generate different Jasper reports in Data Validation Option. You can generate the following Jasper reports in Data Validation Option:

Summary of Tests Run
Displays the number of table pairs or single tables, the number of tests for each table pair or single table, and the overall test results.

Table Pair/Table Summary
Lists each table pair or single table with the associated tests. Data Validation Option displays each table pair or single table on a separate page. The report includes a brief description of each test and result.

Detailed Test Result
Displays each test on a separate page with a detailed description of the test definition and results. If one of the test sources is a SQL, join, or lookup view, the report also displays the view definition. The following information is available in the Detailed Test Result report:
- Test details
- Table pair/single table details
- Runtime information
- Bad record details

Table Pair/Table Run Summary
Displays the summary of all the tests run for a table pair or single table for a given time period. You can click on the date in the report to view the Detailed Test Results report.

Last Run Summary
Displays the details of the last run of tests at the table pair or single table, folder, or user level. The report lists the last test runs for the objects in the given time period. If you generate the report at the user level, the report displays the folder summary. You can click on the folder to view the Last Run Summary report for that folder.


If you generate the report at the folder level or table pair/single table level, you can view the last run for all the table pairs or single tables for the given time period. You can click on a table pair or single table to view the Detailed Test Results report.

Percentage of Bad Rows
Displays the aggregate percentage of bad records in comparison with all the rows processed over a period of time. You can generate the report at the table pair or single table, folder, or user level. You can click on a bar for a particular date on the graph to view the Bad Rows report. The bars on the graph appear for the days on which you ran the tests.

Percentage of Tests Passed
Displays the percentage of passed tests in comparison with the total number of tests over a period of time. You can generate the report at the table pair or single table, folder, or user level. The bars on the graph appear for the days on which you ran the tests.

Tests Run Vs Tests Passed
Displays the total number of tests run over a period of time as a bar chart. The number of tests passed is plotted across the bar chart. You can generate the report at the table pair or single table, folder, or user level. You can click on the tests passed points on the graph to view the Validation Failure by Day report. The bars on the graph appear for the days on which you ran the tests.

Total Rows Vs Percentage of Bad Rows
Displays the total number of rows tested over a period of time as a bar chart. The percentage of bad records is plotted across the bar chart. You can generate the report at the table pair or single table, folder, or user level. You can click on the percentage of bad records points on the graph to view the Bad Rows report. The bars on the graph appear for the days on which you ran the tests.

Bad Rows
Displays the number of bad records across test runs over a period of time in the form of a bar chart. You can click on a bar to view the Table Pair/Table Run Summary report. You can generate the Bad Rows report at the folder level or the table pair/single table level.

Most Recent Failed Runs
Displays the top ten most recent failed runs. You can run the Most Recent Failed Runs report at the folder or user level. If you run the report at the user level, the report displays the top ten failed runs across all the folders in the Data Validation Option repository in the given time period. If you click on a folder, you can view the Most Recent Failed Runs report for the folder. You can also click on the table pair or single table to view the Detailed Test Result report. If you run the report at the folder level, the report displays the top ten failed runs for that particular folder in the given time period. You can click on the table pair or single table to view the Detailed Test Result report.
Note: Select the recency as latest if you want to get the most recent state of the table pair or single table.

Failed Runs
Displays the number of failed runs for each table pair or single table over a period of time in the form of a bar chart. Each bar represents the number of failed runs for a table pair or single table. You can click on a bar to view the Table Pair/Table Run Summary report. You can run the Failed Runs report at the folder level.

Failed Tests
Displays the number of failed tests across test runs over a period of time in the form of a bar chart. Each bar represents the number of failed tests for a table pair or single table. If you click the Table hyperlink, you can view the report in a tabular format. You can click on a bar or the table pair/single table name to view the Detailed Test Result report. You can run the Failed Tests report at the folder level.

Validation Failure by Folder
Displays the number of failed table pairs or single tables as a bar chart for a given time period. Each bar represents the folder in which the failure occurred. If you click the Table hyperlink, you can view the report in a tabular format. You can click on a bar or the folder name to view the Failed Tests report, available in bar chart and tabular formats. You can run the Validation Failure by Folder report at the user or folder level.

External IDs Used In Table Pairs
Displays the list of table pairs or single tables with external IDs across all users. You can run the External IDs Used In Table Pairs report at the user, folder, or table pair level.

Sources Used In Table Pairs/Tables/Views
Displays the list of table pairs, single tables, and views in which a data source is used. Right-click the PowerCenter repository, a repository folder, or a data source in the repository and select Get Source Usage In Table Pairs/Tables/Views to generate this report. You cannot generate a report at the repository folder level or repository level if there are more than 100 sources.

Views Used In Table Pairs/Tables/Views
Displays the list of table pairs and single tables in which a view is used. Right-click the view and select Get View Usage In Table Pairs/Tables/Views to generate this report.

Note: Data Validation Option reports might display the details of table pairs or single tables that you previously deleted from the Data Validation Option Client. If you generate a report after you modify the description of a table pair or a single table, the reports might display two entries for the object, with the same object ID appended to both the new description and the old description, in the report or in the annotations of the bar chart.

Dashboards
You can generate Data Validation Option dashboards to get an overview of the testing activities and test results. Dashboards display multiple reports in a single page. You need not generate individual reports to get an overview of test results across a fixed time period. You can view the following dashboards in Data Validation Option:

Home Dashboard
Displays the following reports for the past 30 days:
- Tests Run Vs Tests Passed
- Total Rows vs Percentage of Bad Rows
- Percentage of Tests Passed
- Percentage of Bad Rows
You can click through the Total Rows vs Percentage of Bad Rows and Percentage of Bad Rows reports to view the Bad Rows report. You can click through Tests Run Vs Tests Passed to view the Validation Failures report.

Repository Dashboard
Displays the Validation Failures for Folders report for the past 24 hours and 30 days in both graphical and tabular formats. The dashboard also displays the Most Recent Failed Runs report for the repository. You can click through the Validation Failure by Folder report to view the Failed Tests report. You can click through the Most Recent Failed Runs report to view the Most Recent Failed Runs report for a folder and the Detailed Test Result report.

Folder Dashboard
Displays the Bad Rows report and Failed Tests report for the past 24 hours and 30 days in both graphical and tabular formats. You can click through the Failed Tests report to view the Table Pair/Table report, and the Bad Rows report to view the Detailed Test Result report.

Table Dashboard
Displays the following reports for the past 30 days:
- Tests Passed Vs Tests Failed
- Bad Rows
- Table Pair/Table Run Summary
You can click through the Bad Rows report to view the Detailed Test Result report.

Metadata Manager Integration


You can view the metadata of data sources if you configure Metadata Manager integration in Data Validation Option. You can analyze the impact of test results on data sources if you enable Metadata Manager integration. You can view the metadata of the data source in the PowerCenter repository. To view the metadata of data sources, you must set up a Metadata Manager Service in the Informatica domain. You must create a PowerCenter resource for the PowerCenter repository that contains the data source. Right-click a data source in the repository and select Get Metadata to view the metadata of the data source.

Configuring Metadata Manager Integration


Configure Metadata Manager integration in Data Validation Option to view the data lineage of data sources.
1. Right-click the repository for which you want to enable Metadata Manager integration and select Edit Repository.
2. Select Enable Metadata Manager Integration.
3. Select whether the Metadata Manager Service runs on a secure connection.
4. Enter the server host name.
5. Enter the Metadata Manager Service port.
6. Enter the name of the PowerCenter resource.
7. Click Test to test the settings.
8. Click Save.


CHAPTER 15

Command Line Integration


This chapter includes the following topics:
Command Line Integration Overview, 103
CopyFolder, 104
CreateUserConfig, 105
DisableInformaticaAuthentication, 105
ExportMetadata, 106
ImportMetadata, 107
InstallTests, 107
LinkDVOUsersToInformatica, 108
PurgeRuns, 109
RefreshRepository, 110
RunTests, 111
UpdateInformaticaAuthenticationConfiguration, 112
UpgradeRepository, 113

Command Line Integration Overview


You can invoke Data Validation Option capabilities at the command line. For example, you can create and run tests without using the Data Validation Option Client. Running tests at the command line allows you to schedule test execution. It also allows you to embed a specific test as part of the ETL workflow or as part of another process. For example, you can create an ETL process that moves data from source to staging, runs validation, and then moves data into the target or an error table based on the validation results. The command line utility writes regular messages to the STDOUT output stream. It writes error messages to the STDERR output stream. Normally, command utility messages appear on the screen. To capture messages to a file, use the redirection operator. In a Windows machine, the Data Validation Option command line utility is DVOCmd.exe. DVOCmd.exe exists in one of the following directories:
32-bit operating systems: C:\Program Files\Informatica<version>\DVO\
64-bit operating systems: C:\Program Files (x86)\Informatica<version>\DVO\

Important: You must run DVOCmd from the Data Validation Option installation directory.
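For example, to keep a record of a run, you might redirect both output streams to files. In the following sketch, the external ID abc123 and the log file names are placeholders:

DVOCmd RunTests abc123 > dvocmd_output.log 2> dvocmd_errors.log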


DVOCmd uses the following syntax:


DVOCmd Command [Argument] [--Options] [Arguments]

To get help on a command, enter the command at the prompt without any argument. For example:
$HOME/DVO/DVOCmd RunTests

To enable users to run tests from a UNIX machine, run the following command from the DVOCmd installation directory:
$HOME/DVO/DVOCmd dos2unix DVOCmd

Note: In the syntax descriptions, options and arguments enclosed in square brackets are optional.

CopyFolder
Copies the contents of a folder in a user workspace to a different folder in the same workspace or to another user workspace. The target folder must be a new folder. The target folder cannot exist in the target workspace. The CopyFolder command copies the table pairs, single tables, and test cases that exist within the source folder. It does not copy test runs or the external IDs associated with table pairs or single tables. If the table pairs or single tables in the source folder use an SQL or lookup view, the CopyFolder command copies the SQL or lookup view to the target user workspace unless the workspace contains a view with the same name. Before Data Validation Option copies a folder, it verifies that all data sources associated with the objects being copied exist in the target workspace. If any data source is missing, Data Validation Option does not copy the folder. The CopyFolder command uses the following syntax:
DVOCmd CopyFolder [--confdir conf_dir] --fromUser source_user --fromFolder source_folder --toUser target_user [--toFolder target_folder] [--reuseViews Yes] [--username Username] [--password Password]

The following table describes CopyFolder options and arguments:


--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Because Windows directories often contain spaces, you must enclose the file path in quotes.

--fromUser source_user
Name of the source user. Data Validation Option copies the folder in this user workspace.

--fromFolder source_folder
Name of the source folder. Data Validation Option copies this folder.

--toUser target_user
Name of the target user. Data Validation Option copies the folder to this user workspace. The source user and the target user can be the same user.

--toFolder target_folder
Name of the target folder. The target folder must be unique in the target workspace. If you do not specify a target folder, Data Validation Option creates a folder in the target workspace with the same name as the source folder.

--reuseViews Yes
Reuses an SQL or lookup view in the target workspace when the workspace contains a view with the same name as a source SQL or lookup view. If you specify this option, Data Validation Option does not copy the source view to the target workspace. If you do not specify this option, Data Validation Option prompts you for the action to take when views with the same name are found.

--username User name
Informatica domain user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.

--password Password
Password for the Informatica user name. Required if you configure Informatica authentication.
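For example, the following command copies the folder Q3_Tests from the workspace of user dev_user to a new folder named Q3_Tests_Copy in the workspace of user qa_user. The user and folder names are placeholders:

DVOCmd CopyFolder --fromUser dev_user --fromFolder Q3_Tests --toUser qa_user --toFolder Q3_Tests_Copy --reuseViews Yes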

CreateUserConfig
Creates Data Validation Option users with the specified user names. The CreateUserConfig command uses the following syntax:
DVOCmd CreateUserConfig user_name [, user_name, ] [--confdir conf_dir] --outputdir output_dir [--overwrite]

Creates a preferences file called <username>-preferences.xml for each user in the output directory. The preferences file contains connection information for the Data Validation Option repository. Copy each preferences file from the output directory to the user configuration directory and rename it to preferences.xml. This allows each user to access the Data Validation Option repository. The following table describes CreateUserConfig options and arguments:
user_name
The name of the Data Validation Option user. To create multiple users, enter multiple user names separated by commas.

--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Enclose the file path in quotes.

--outputdir output_dir
Directory in which to store user preferences files. Enclose the file path in quotes. Use double quotes if the path has spaces or special characters.

--overwrite
Overwrites the configuration files.
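For example, the following command creates preferences files for two users in an output directory. The user names and path are placeholders:

DVOCmd CreateUserConfig dvo_user1,dvo_user2 --outputdir "C:\temp\dvo_config" --overwrite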

RELATED TOPICS:
Data Validation Option Configuration for Additional Users on page 26

DisableInformaticaAuthentication
Disables Informatica authentication in a Data Validation Option schema.


The DisableInformaticaAuthentication command uses the following syntax:


DVOCmd DisableInformaticaAuthentication [--confdir conf_dir] [--username User name] [--password Password]

The following table describes DisableInformaticaAuthentication options and arguments:


--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Enclose the file path in quotes.

--username User name
Informatica domain administrator user name for the domain to which you configured Informatica authentication.

--password Password
Password for the Informatica user name.
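For example, the following command disables Informatica authentication. The administrator user name and password are placeholders:

DVOCmd DisableInformaticaAuthentication --username admin_user --password admin_password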

ExportMetadata
Exports all Data Validation Option metadata to an XML file. The ExportMetadata command uses the following syntax:
DVOCmd ExportMetadata file_name [--confdir conf_dir] [--username User name] [--password Password]

The following table describes ExportMetadata options and arguments:


file_name
The file to which you want to export metadata.

--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Because Windows directories often contain spaces, you must enclose the file path in quotes.

--username User name
Informatica domain user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.

--password Password
Password for the Informatica user name. Required if you configure Informatica authentication.
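For example, the following command exports all Data Validation Option metadata to a file. The file path is a placeholder:

DVOCmd ExportMetadata "C:\temp\dvo_metadata.xml"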


RELATED TOPICS:
Metadata Export and Import

ImportMetadata
Imports metadata into Data Validation Option from an export XML file. The ImportMetadata command uses the following syntax:
DVOCmd ImportMetadata file_name [--confdir conf_dir] [--overwrite] [--username Username] [--password Password]

The following table describes ImportMetadata options and arguments:


file_name
The file that contains metadata to be imported.

--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Because Windows directories often contain spaces, you must enclose the file path in quotes.

--overwrite
Overwrites existing objects.

--username User name
Informatica domain user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.

--password Password
Password for the Informatica user name. Required if you configure Informatica authentication.
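For example, the following command imports metadata from a previously exported file and overwrites existing objects. The file path is a placeholder:

DVOCmd ImportMetadata "C:\temp\dvo_metadata.xml" --overwrite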

RELATED TOPICS:
Metadata Export and Import

InstallTests
Prepares all tests for a single table or table pair. For each test, this command generates the PowerCenter mapping in the Data Validation Option target folder in the PowerCenter repository. The InstallTests command uses the following syntax:
DVOCmd InstallTests external_ID [, external_ID, ] [--confdir conf_dir] [--cacheSize CACHESIZE] [--username User name] [--password Password]


The following table describes InstallTests options and arguments:


external_ID
The external ID for the single table or table pair.

--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Enclose the file path in quotes.

--cacheSize CACHESIZE
Memory allocation to generate transformations in PowerCenter mappings. Increase the cache size for tests that contain multiple joins and lookups. Default is 20 MB for the data cache and 10 MB for the index cache. Specify "Auto" to enable PowerCenter to compute the cache size.

--username User name
Informatica domain administrator user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.

--password Password
Password for the Informatica user name. Required if you configure Informatica authentication.

--forceInstall
Creates new mappings for table pairs in the repository. DVOCmd uses the DTM buffer size value configured in preferences.xml. If you modify the DTM buffer size value in the preferences file, run the InstallTests command with the forceInstall option for existing table pairs before you run the RunTests command.

Cache Settings
You can configure the cache settings for the PowerCenter transformations that Data Validation Option generates for the tests. Data Validation Option generates a PowerCenter mapping with Joiner transformations and Lookup transformations for a test that contains joins and lookups. Joiner transformations and Lookup transformations require a large amount of memory. Configure the cache settings for a complex test that contains joins and lookups. The value that you specify as cache settings for a test persists until you modify the test. For information about optimal cache setting for the tests, see the PowerCenter Advanced Workflow Guide .
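For example, the following command installs the tests for a table pair and lets PowerCenter compute the cache size. The external ID abc123 is a placeholder:

DVOCmd InstallTests abc123 --cacheSize Auto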

LinkDVOUsersToInformatica
Links the existing Data Validation Option users with Informatica domain users. Create a text file that contains a list of the Data Validation Option users and Informatica domain users in the following format:
<dvo_user_name1>,<Informatica_user_name1>
<dvo_user_name2>,<Informatica_user_name2>
...
<dvo_user_nameN>,<Informatica_user_nameN>


The LinkDVOUsersToInformatica command uses the following syntax:


DVOCmd LinkDVOUsersToInformatica [file_name]

The following table describes the LinkDVOUsersToInformatica argument:

file_name
Name of the file that contains the mapping between Data Validation Option users and Informatica users. Enclose the file path in quotes.
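For example, assuming a mapping file at C:\temp\dvo_user_map.txt that contains lines such as dvo_analyst,jane_doe (the path and names are placeholders), you might run:

DVOCmd LinkDVOUsersToInformatica "C:\temp\dvo_user_map.txt"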

PurgeRuns
Purges test runs from the Data Validation Option repository. You can purge deleted test runs or purge test runs by date. When you purge test runs by date, you can purge all test runs that occur on or after a specified date, before a specified date, or between two dates. The PurgeRuns command uses the following syntax:
DVOCmd PurgeRuns [--confdir conf_dir] [--deleted] [--fromdate from_date] [--todate to_date] [--username User name] [--password Password]

If you configure Informatica authentication for the Data Validation Option schema, enter --username and --password. The following table describes PurgeRuns options and arguments:

--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Because Windows directories often contain spaces, you must enclose the file path in quotes.

--deleted
Purges deleted test runs.

--fromdate from_date
Purges test runs that occur on or after this date.

--todate to_date
Purges test runs that occur before this date.

--username User name
Informatica domain user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.

--password Password
Password for the Informatica user name. Required if you configure Informatica authentication.
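For example, the following command purges test runs that occurred during June 2012. The date format shown is an assumption based on the MM/DD/YYYY convention used elsewhere in this guide:

DVOCmd PurgeRuns --fromdate 06/01/2012 --todate 07/01/2012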


RefreshRepository
Refreshes a source or target repository. The RefreshRepository command uses the following syntax:
DVOCmd RefreshRepository repo_name [--confdir conf_dir] [--all] [--connections] [--folderlist] [--allfolders] [--folder folder_name] [--username User name] [--password Password] [--dryrun]

Tip: Use the --folderlist and --folder options to get the sources and targets in a new PowerCenter repository folder. For example, if the repository name is "DVTgtRepo" and the new folder name is "NewOrders," enter the following command:
DVOCmd RefreshRepository DVTgtRepo --folderlist --folder NewOrders

Important: The RefreshRepository command fails to run from the UNIX command line. If you want to run the RefreshRepository command from the command line, use a Windows machine. The following table describes RefreshRepository options and arguments:

repo_name
Name of the repository you want to refresh. Note: This can take several minutes to several hours depending on the size of the repository.

--confdir conf_dir
The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Use double quotes if the path has spaces or special characters.

--all
Refreshes all source, target, folder, and connection metadata for the repository. Note: This option can take several minutes to several hours, depending on the size of the repository and the number of sources and targets in the PowerCenter repository.

--connections
Refreshes connection metadata for the target repository.

--folderlist
Refreshes the folder list for the repository.

--allfolders
Refreshes source and target metadata in all folders in the repository.

--folder folder_name
Refreshes source and target metadata for the named folder.

--username User name
Informatica domain user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.

--password Password
Password for the Informatica user name. Required if you configure Informatica authentication.

--dryrun
Checks whether import from the PowerCenter Repository Service works. RefreshRepository reads the sources and targets in the PowerCenter Repository Service, but does not write the objects into the Data Validation Option repository.


RELATED TOPICS:
Repositories Overview on page 36

RunTests
Runs all tests for a single table or table pair. For example, to run tests for a table pair with the external ID "abc123," you might enter the following command:
DVOCmd RunTests abc123

If one test fails, the overall result also fails. The exit status code for a successful test is 0. A non-zero status code designates a failed test or an error. The following table describes the status codes:
0 - The test is successful.
1 - The test fails to install.
2 - The test fails to run.
3 - The test does not give any result.
4 - The test fails.

The RunTests command uses the following syntax:


DVOCmd RunTests external_ID [, external_ID, ] [--confdir conf_dir] [--email email_ID,...] [--sendEmail NotPass] [--cacheSize CACHESIZE] [--username User name] [--password Password]

The following table describes RunTests options and arguments:


Option | Argument | Description
n/a | external_ID | The external ID for the single table or table pair.
--confdir | conf_dir | The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Enclose the file path in quotes.
--email | email_ID | The email address to which Data Validation Option sends an email when the tests are complete. You can provide multiple email addresses separated by commas. The email specifies whether the test has passed or failed and provides a link to the test results. Note: Configure the SMTP settings for the outgoing email server on the PowerCenter Integration Service with the following custom properties: SMTPServerAddress, SMTPPortNumber, SMTPFromAddress, and SMTPServerTimeout. If you want to use Microsoft Outlook to send email, enter the Microsoft Exchange profile in the MSExchangeProfile configuration property in the PowerCenter Integration Service.
--sendEmail | NotPass | Limits Data Validation Option to sending an email only if the test fails.
--cacheSize | CACHESIZE | Memory allocation to generate transformations in PowerCenter mappings. Increase the cache size for tests that contain multiple joins and lookups. Default is 20 MB for the data cache and 10 MB for the index cache. Specify "Auto" to enable PowerCenter to compute the cache size. When you run the RunTests command and set the cache size, Data Validation Option installs the tests in the repository before it runs the test.
--username | User name | Informatica domain user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.
--password | Password | Password for the Informatica user name. Required if you configure Informatica authentication.
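For example, the following command runs the tests for a table pair with the external ID abc123 and sends an email only if a test fails. The external ID and email addresses are placeholders:

DVOCmd RunTests abc123 --email qa@example.com,etl@example.com --sendEmail NotPass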

Cache Settings
You can configure the cache settings for the PowerCenter transformations that Data Validation Option generates for the tests. Data Validation Option generates a PowerCenter mapping with Joiner transformations and Lookup transformations for a test that contains joins and lookups. Because Joiner transformations and Lookup transformations require a large amount of memory, configure the cache settings for a complex test that contains joins and lookups. The cache settings that you specify for a test persist until you modify the test. For information about optimal cache settings for tests, see the PowerCenter Advanced Workflow Guide.
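For example, the following command (again with a placeholder external ID) lets PowerCenter compute the cache size instead of using the default 20 MB data cache and 10 MB index cache:

DVOCmd RunTests abc123 --cacheSize Auto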

UpdateInformaticaAuthenticationConfiguration
Updates the Informatica authentication properties in the Data Validation Option schema. The UpdateInformaticaAuthenticationConfiguration command uses the following syntax:
DVOCmd UpdateInformaticaAuthenticationConfiguration [--confdir conf_dir] [--infahostname INFAHOSTNAME] [--infahttpport INFAHTTPPORT] [--isSecureConnection ISSECURECONNECTION] [--infaAdminUserName INFAADMINUSERNAME] [--infaAdminPassword INFAADMINPASSWORD]

You can obtain these parameters from the nodemeta.xml file available in the following location:
<InformaticaInstallationDir>/server/isp

The following table describes UpdateInformaticaAuthenticationConfiguration options and arguments:


Option | Argument | Description
--confdir | conf_dir | The user configuration directory. Specify the configuration directory if you have multiple Data Validation Option repositories on a client machine. If you have one Data Validation Option repository on a client machine and have not changed the configuration directory, you do not need to specify this option. Because Windows directories often contain spaces, you must enclose the file path in quotes.
--infahostname | INFAHOSTNAME | Host name of the Informatica gateway.
--infahttpport | INFAHTTPPORT | Port number to connect to the Informatica gateway. Enter the value of the port element in the nodemeta.xml file.
--isSecureConnection | ISSECURECONNECTION | Set the argument as true if TLS is enabled in the Informatica gateway.
--infaAdminUserName | INFAADMINUSERNAME | Informatica administrator user for the Informatica gateway.
--infaAdminPassword | INFAADMINPASSWORD | Password for the Informatica administrator user.
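For example, the following command enables Informatica authentication against a gateway host. The host name, port number, and administrator credentials are placeholders; take the actual values from nodemeta.xml:

DVOCmd UpdateInformaticaAuthenticationConfiguration --infahostname infagateway.example.com --infahttpport 6005 --isSecureConnection false --infaAdminUserName Administrator --infaAdminPassword AdminPassword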

UpgradeRepository
Upgrades the Data Validation Option repository. Use this command when you upgrade from a previous version of Data Validation Option. The UpgradeRepository command uses the following syntax:
DVOCmd UpgradeRepository [--username User name] [--password Password]

The following table describes UpgradeRepository options and arguments:


Option | Argument | Description
--username | User name | Informatica domain user name for the domain to which you configured Informatica authentication. Required if you configure Informatica authentication.
--password | Password | Password for the Informatica user name. Required if you configure Informatica authentication.
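For example, the following command upgrades the repository in a domain that uses Informatica authentication. The user name and password are placeholders:

DVOCmd UpgradeRepository --username Administrator --password AdminPassword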

RELATED TOPICS:
Data Validation Option Installation and Configuration on page 20


CHAPTER 16

Troubleshooting
This chapter includes the following topics:
Troubleshooting Overview, 114 Troubleshooting Initial Errors, 114 Troubleshooting Ongoing Errors, 115 Troubleshooting Command Line Errors, 116

Troubleshooting Overview
When you run a test, Data Validation Option performs the following tasks:
1. Creates a mapping in the specified PowerCenter folder.
2. Creates the PowerCenter workflow.
3. Runs the PowerCenter session.

Besides initial installation problems, Data Validation Option errors can occur in one of the following steps:

Installation Error
Data Validation Option cannot create a PowerCenter mapping.

Run Error
Data Validation Option cannot create or run a PowerCenter workflow.

No Results
The PowerCenter session runs but fails, or there are no results in the results database.

Troubleshooting Initial Errors


This section assumes that Data Validation Option has just been installed and no successful tests have been executed. It also assumes that the first test is a simple test that does not contain expressions or SQL views.


The following table describes common initial errors:

Cannot connect to the Data Validation Option repository
Database credentials are incorrect. Check the server, port, and database name specified in the URL line. If the problem persists, contact your database administrator. If the problem is the database user name and password, the error message explicitly states this.

Cannot read the PowerCenter repository
Check the repository settings up to Security Domain. (Informatica domain names are not used until later.) If you cannot resolve the error, contact the Informatica administrator. You can also troubleshoot this error by trying to log in to the repository through the pmrep command line utility. If you get an out-of-memory error when you access large repositories, increase the Java heap size of the Data Validation Option client from the command line with the following command:
DVOClient.exe -J-Xmx<heapsize value>
Default is 1024 MB.

Installation Error
Verify that the Data Validation Option folder is closed in the Designer, Workflow Manager, and Repository Manager. The Workflow Monitor can be open. Verify that the INFA_HOME environment variable is set. Verify that the Data Validation Option folder exists.

Run Error
Verify that the Data Validation Option folder is closed. Check the Informatica domain name and Integration Service names, and verify that they are running. Verify that the PowerCenter connection name (the connection to the Data Validation Option repository) is correct, and that the user has the privilege to use it. Open the session log and look for session errors. Most session failures are caused by an incorrect connection. If the error is "Cannot get object class for dvo/infact/PivotPluginImpl," the dvoct.jar file cannot be read, either because it is not on the server, because of its privileges, or because the information entered in the Administrator tool is incorrect. Verify that the user has the privilege to use the connection to the Data Validation Option repository specified in Tools > Preferences > Data Validation Option database. This is also apparent in the session log. If you install PowerCenter 9.0.1 or earlier and the data source is SAP, install the ABAP program on the mapping generated by the test in the Designer tool.

No Results
Verify that there is data in the data set you are analyzing. Tables should have records, and filters and joins should not result in an empty set. Verify that the connection to the Data Validation Option repository specified in the Workflow Manager points to the Data Validation Option repository.

Troubleshooting Ongoing Errors


This section assumes that successful tests have been created and run before the error occurred. In general, you should always check for the following sources of errors:
Incorrect connection
The Data Validation Option folder is open in the Designer, Workflow Manager, or Repository Manager


The following table describes common ongoing errors:

Installation or run errors
Verify that the Data Validation Option folder is closed in the Designer, Workflow Manager, and Repository Manager. Verify that the PowerCenter environment is functioning correctly, for example, that services are running and the repository is up. If the error occurred right after you created an expression, either in a test editor dialog box or as a WHERE clause, check the syntax. Open the session log and verify that the PowerCenter connections are correct.

No results
Verify that there is data in the data set you are analyzing. Tables should have records. Filters or joins should not result in an empty set.

Inability to generate reports
Verify that you have read and write permissions on the Data Validation Option installation directory and subdirectories.

Inability to copy folders
Verify that the repository, data sources, and folders that contain the data sources have identical names in the source and the target workspaces. Object names in Data Validation Option are case sensitive. Verify that all data sources associated with the objects to copy exist in the target workspace in the same location and that the names match.

Troubleshooting Command Line Errors


I ran a DVOCmd command that got an error and used the redirection operator to write the messages to a file. The redirection operator does not redirect all messages to the output file.

When you run a DVOCmd command, the command line utility writes regular messages to the STDOUT output stream and writes error messages to the STDERR output stream. You use the redirection operator to write messages to an output file. To merge messages from both output streams, enter "2>&1" after the output file name. For example, you encounter an error while refreshing folder "MyFolder" in Data Validation Option repository "DVORepo." To write all messages, including the error messages, to a text file called "Log.txt," use the following command:
DVOCmd RefreshRepository DVORepo --folder MyFolder > C:\Log.txt 2>&1

To append messages to an existing log file called "DVOLog.txt," use the following command:
DVOCmd RefreshRepository DVORepo --folder MyFolder >> C:\DVOLog.txt 2>&1


APPENDIX A

Datatype Reference
This appendix includes the following topics:
Test, Operator, and Datatypes Matrix for Table Pair Tests, 117 Test, Operator, and Datatypes Matrix for Single-Table Constraints, 118

Test, Operator, and Datatypes Matrix for Table Pair Tests


Table pair tests use the following datatypes:
s = string datatypes
n = numeric datatypes
d = date/time datatypes
b = binary/other datatypes

The following table describes the operators and datatypes for table pair tests:
Test | Operators Allowed | Datatypes Allowed: Approx. Operator | Datatypes Allowed: All Other Operators
COUNT | All | s,n,d,b | s,n,d,b
COUNT_DISTINCT | All | s,n,d | s,n,d
COUNT_ROWS | All | s,n,d,b | s,n,d,b
MIN | All | n | s,n,d
MAX | All | n | s,n,d
AVG | All | n | n
SUM | All | n | n
SET_AinB | - | s,n,d | s,n,d
SET_BinA | - | s,n,d | s,n,d
SET_AeqB | - | s,n,d | s,n,d
VALUE | All | n | s,n,d
OUTER_VALUE | All | n | s,n,d

Note: SET tests do not use operators and allow string, numeric and date/time datatypes.

Test, Operator, and Datatypes Matrix for Single-Table Constraints


Single-table constraints use the following datatypes:
s = string datatypes
n = numeric datatypes
d = date/time datatypes
b = binary/other datatypes

The following table describes the operators and datatypes for single-table constraints:
Test | Operators Allowed | Datatypes Allowed | Result Expression Datatype
COUNT | All | s,n,d,b | n
COUNT_DISTINCT | All | s,n,d | n
COUNT_ROWS | All | s,n,d,b | n
MIN | All | s,n,d | n
MAX | All | s,n,d | n
AVG | All | n | n
SUM | All | n | n
VALUE | All | s,n,d | s,n,d
FORMAT | =, <> | s,n,d | s
UNIQUE | - | s,n,d | -
NOT_NULL | - | s,n,d | -
NOT_BLANK | - | s,n,d | -


APPENDIX B

BIRT Report Examples


This appendix includes the following topics:
Summary of Testing Activities, 119 Table Pair Summary, 120 Detailed Test Results Test Page, 122 Detailed Test Results Bad Records Page, 123

Summary of Testing Activities


The following figure shows an example of a Summary of Testing Activities report:


Table Pair Summary


The following figure shows an example of a Table Pair Summary report:


Detailed Test Results Test Page


The following figure shows an example of a test page in the Detailed Test Results report:


Detailed Test Results Bad Records Page


The following figure shows an example of a bad records page in the Detailed Test Results report:


APPENDIX C

Jasper Report Examples


This appendix includes the following topics:
Home Dashboard, 124 Repository Dashboard, 125 Folder Dashboard, 126 Tests Run Vs Tests Passed, 126 Total Rows Vs Percentage of Bad Records , 127 Most Recent Failed Runs, 127 Last Run Summary , 128

Home Dashboard
The Home dashboard displays the details of tests run in the Data Validation Option schema over the past 30 days.


Repository Dashboard
Repository Dashboard displays the details of tests run in the Data Validation Option repository over the past 30 days and the past 24 hours.


Folder Dashboard
Folder Dashboard displays the details of tests run in the repository over the past 30 days and the past 24 hours.

Tests Run Vs Tests Passed


The following figure shows an example of a Tests Run Vs Tests Passed report:


Total Rows Vs Percentage of Bad Records


The following figure shows an example of a Total Rows Vs Percentage of Bad Records report:

Most Recent Failed Runs


The following figure shows an example of a Most Recent Failed Runs report:


Last Run Summary


The following figure shows an example of a Last Run Summary report:


APPENDIX D

Reporting Views
This appendix includes the following topics:
Reporting Views Overview, 129 results_summary_view, 129 rs_bad_records_view, 132 results_id_view, 133 meta_sv_view, 133 meta_lv_view, 134 meta_jv_view, 135 meta_ds_view, 136 meta_tp_view, 136 rs_sv_id_view, rs_lv_id_view, and rs_jv_id_view, 137

Reporting Views Overview


All Data Validation Option reports run against database views that are set up as part of the installation process. You can write custom reports against these views. Do not write reports against the underlying database tables because the Data Validation Option repository metadata can change between versions.

results_summary_view
This view combines all table pair and test metadata and all test results. This view consists of the following general sections:
tp_. Table pair metadata
tc_. Test metadata
tc_rs_. Test results
tr_, ti_. Table pair runtime information


The following table describes table pair information:


Metadata | Description
tp_user_id | User ID of the person who ran this test
tp_username | User name of the person who ran this test
tp_obj_id | Unique table pair ID
tp_version | Table pair version
tp_name | Table pair description
tp_time_stamp | Time the table pair was last edited
tp_comments | Table pair comments
tp_description | Table pair description. Same as tp_name.
tp_table_a, tp_table_b | Either the full table name, including the PowerCenter repository directory, or the view name
tp_table_version_a, tp_table_version_b | Version name for SQL and lookup views, otherwise empty
tp_type_a, tp_type_b | A and B source type: 1 = relational, 2 = flat file, 3 = SQL view, 4 = lookup view
tp_conn_name_a, tp_conn_name_b | A and B connection names
tp_owner_name_a, tp_owner_name_b | A and B owner names
tp_src_dir_a, tp_src_file_a, tp_src_dir_b, tp_src_file_b | A and B directory and file names if flat files
tp_in_db_a, tp_in_db_b | Run aggregations in database for A or B
tp_where_clause_a, tp_where_clause_b | A and B WHERE clauses
tp_is_where_clause_dsq_a, tp_is_where_clause_dsq_b | Run WHERE clause in database for A or B
tp_join_list_str | Joined fields as one string
tp_external_id | Table pair external ID


The following table describes test information:


Metadata | Description
tc_index | Internal ID
tc_description | Test description
tc_comment | Test comment
tc_type | AGG if aggregate, otherwise equal to test (VALUE, OUTER_VALUE, etc.)
tc_agg_func | Aggregate test if AGG (COUNT, SUM, etc.), otherwise blank
tc_column_a | Field A
tc_operator | Operator
tc_column_b | Field B
tc_tables_type | 0 = two-table pair, 1 = one-table constraint
tc_threshold | Threshold
tc_max_bad_records | Maximum bad records
tc_is_case_insensitive | Case insensitive checkbox: 0 = false, 1 = true
tc_is_treat_nulls_equal | Null = Null: 0 = false, 1 = true
tc_is_trim_right_ws | Trim trailing spaces: 0 = false, 1 = true
tc_expression_a | Expression A
tc_expression_b | Expression B

The following table describes test results information:


Metadata | Description
tc_rs_result | Test result: 1 = pass, 0 = fail, -1 = no_results, -2 = error
tc_rs_failure_count | Number of bad records
tc_rs_processed_count | Number of records processed
tc_rs_count_rows_a, tc_rs_count_rows_b | Number of records in A and B
tc_rs_agg_value_a, tc_rs_agg_value_b | Aggregate results A and B


The following table describes table pair runtime information:


Metadata | Description
tr_id | Unique internal run ID incremented for each run
tr_state | Result state: 2 = install_error, 4 = run_success, 5 = run_error
tr_start_time | Table pair run start time, in milliseconds, since 1970 UTC
tr_finish_time | Run finish time
tr_is_latest | Whether this is the latest run of a given table pair: 1 = latest, 0 = not latest
tr_error_msg | Error message for a run error
ti_id | Internal ID. Do not use.
ti_folder_name | PowerCenter folder name where the mapping was created
ti_mapping_name | PowerCenter mapping name
ti_session_name | PowerCenter session name
ti_workflow_name | PowerCenter workflow name

rs_bad_records_view
This view is a detailed view of all bad records. It can be joined with results_summary_view on tr_id and tc_index. The following table describes bad records information:
Metadata | Description
tr_id | Run ID, joined to tr_id in results_summary_view
tc_index | Joined to tc_index in results_summary_view
tc_rs_br_key_a | Key A
tc_rs_br_value_a | Value A
tc_rs_br_key_b | Key B
tc_rs_br_value_b | Value B
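For example, the following query is a minimal sketch of a custom report that lists the bad records for the latest failed test runs. It assumes the repository database accepts ANSI join syntax; adjust identifiers and quoting for your database:

SELECT rs.tp_name, rs.tc_description,
       br.tc_rs_br_key_a, br.tc_rs_br_value_a, br.tc_rs_br_value_b
FROM results_summary_view rs
JOIN rs_bad_records_view br
  ON br.tr_id = rs.tr_id AND br.tc_index = rs.tc_index
WHERE rs.tr_is_latest = 1
  AND rs.tc_rs_result = 0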


results_id_view
This view uses results_summary_view as the source and aggregates it on the table pair level. This view contains the table pair result. The following table describes results ID information:
Metadata | Description
tr_id | Internal run ID incremented for each run
tr_is_latest | Whether this is the latest run of a given table pair: 1 = latest, 0 = not latest
tr_start_time | Table pair run start time, in milliseconds, since 1970 UTC
ti_id | Internal ID
tp_obj_id | Table pair ID
tp_version | Table pair version
tp_user_id | User ID of the person who ran this test
tp_rs_result | Table pair result: 1 = pass, 0 = fail, -1 = no results, -2 = error

meta_sv_view
This view returns all SQL view information. The following table describes SQL view information:
Metadata | Description
sv_id | Unique ID
sv_name | Internal name
sv_obj_id | View object ID
sv_version | View version
sv_description | Description
sv_comment | Comment
sv_owner_name | Owner name
sv_conn_name | Connection
sv_dsname | Table name
sv_sql_query | SQL statement
svf_name | Column name
svf_business_name | Not used
svf_datatype | Column datatype
svf_precision | Column precision
svf_scale | Column scale
svf_is_key | Not used

meta_lv_view
This view returns all lookup view information. The following table describes lookup view information:
Metadata | Description
lv_id | Unique ID
lv_name | Internal name
lv_obj_id | View object ID
lv_version | View version
lv_tp_name | Table pair description
lv_time_stamp | Time the lookup view was last edited
lv_comments | Comments
lv_description | Lookup view description
lv_table_a | Source table name
lv_table_b | Lookup table name
lv_type_a, lv_type_b | Source and lookup types: 1 = relational, 2 = flat file
lv_conn_name_a, lv_conn_name_b | Source and lookup connections
lv_owner_name_a, lv_owner_name_b | Source and lookup owner names
lv_src_dir_a, lv_src_file_a, lv_src_dir_b, lv_src_file_b | Source and lookup directory and file names if flat files
lv_where_clause_a, lv_where_clause_b | Not used
lv_is_where_clause_dsq_a, lv_is_where_clause_dsq_b | Not used
lv_join_list_str | Lookup relationship (join) as a string description

meta_jv_view
This view returns all join view information. The following table describes join view information:
Metadata | Description
jv_id | Unique ID
jv_name | Internal name
jv_obj_id | View object ID
jv_version | View version
jv_description | Description
jv_table_name | Table name
jv_alias_name | Alias name of the table
jv_table_join_type | Type of join
jv_table_position | Position of the join in the table
jv_is_live | Whether the view is in use


meta_ds_view
This view returns all the data sources used by table pairs, SQL views, lookup views, and join views. The following table describes data source information:

Metadata | Description
id | Unique ID of the table pair, SQL view, lookup view, or join view
object_name | Name of the data source
object_description | Description of the data source
object_folder_name | Folder in which the data source is available
object_id | Unique ID of the data source
object_version | Version of the data source in the PowerCenter repository
object_type | Type of the data source
object_is_live | Whether the table pair, SQL view, lookup view, or join view is available to the users
object_table_name | Table name of the data source
object_user_name | User who used the data source
object_user_id | ID of the user who used the data source

meta_tp_view
This view returns all table pair information. The following table describes table pair information:
Metadata | Description
tp_user_id | User ID
tp_username | User name
tp_obj_id | Unique table pair ID
tp_version | Table pair version
tp_name | Table pair description
tp_time_stamp | Time the table pair was last edited
tp_comments | Table pair comments
tp_description | Table pair description. Same as tp_name.
tp_table_a, tp_table_b | Either the full table name, including the PowerCenter repository directory, or the view name
tp_table_version_a, tp_table_version_b | Version name for SQL and lookup views, otherwise empty
tp_type_a, tp_type_b | A and B source type: 1 = relational, 2 = flat file, 3 = SQL view, 4 = lookup view
tp_conn_name_a, tp_conn_name_b | A and B connection names
tp_owner_name_a, tp_owner_name_b | A and B owner names
tp_src_dir_a, tp_src_file_a, tp_src_dir_b, tp_src_file_b | A and B directory and file names if flat files
tp_in_db_a, tp_in_db_b | Run aggregations in database for A or B
tp_where_clause_a, tp_where_clause_b | A and B WHERE clauses
tp_is_where_clause_dsq_a, tp_is_where_clause_dsq_b | Run WHERE clause in database for A or B
tp_join_list_str | Joined fields as one string
tp_external_id | Table pair external ID
tp_is_live | Indicates whether the table pair is active

rs_sv_id_view, rs_lv_id_view, and rs_jv_id_view


These views return the SQL view, lookup view, or join view IDs and columns for the query criteria in the results_summary_view view. A view can contain duplicate IDs. You can use SELECT DISTINCT to avoid duplicate IDs.


APPENDIX E

Metadata Import Syntax


This appendix includes the following topics:
Metadata Import Syntax Overview, 138 Table Pair with One Test, 138 Table Pair with an SQL View as a Source, 139 Table Pair with Two Flat Files, 139 Single-Table Constraint, 140 SQL View, 141 Lookup View, 141

Metadata Import Syntax Overview


The following sections display examples of metadata syntax definition.

Table Pair with One Test


<TablePair> Name = "CLIDETAIL_CLISTAGE" Description = "CLIDETAIL-CLISTAGE" ExternalID = "cliTest" SaveDetaliedBadRecords = true SaveBadRecordsTo = "SCHEMA" BadRecordsFileName = "" ParameterFile = "C:\\test_param.txt" <TableA> Name = "pc_repository/cli_demo/Sources/demo_connection/cliDetail" Connection = "dvo_demo_connection" WhereClause = "" WhereClauseDSQ = false InDB = false <TableB> Name = "pc_repository/cli_demo/Targets/cliStage" Connection = "dvo_demo_connection" WhereClause = "" WhereClauseDSQ = false InDB = false <Parameter> Name = "$$NewParameter1" Type = "string" Precision = "10"

138

Scale = "0" <TestCase> TestType = "AGG" Aggregate = "SUM" ColumnA = "ProductAmount" ColumnB = "CustomerAmount" Operator = "=" Comments = "" CaseInsensitive = false TrimRightWhitespace = false TreatNullsEqual = true

Table Pair with an SQL View as a Source


<TablePair> Name = "Joined_MYVIEW_FACTORDERS" Description = "Joined MYVIEW-FACTORDERS" ExternalID = "" <TableA> Name = "SQLView_470" WhereClause = "" WhereClauseDSQ = false InDB = false <TableB> Name = "pc_repository/dvo_demo/Targets/factOrders" Connection = "dvo_demo_connection" WhereClause = "" WhereClauseDSQ = false InDB = false <Join> ColumnA = "MyID" ColumnB = "LineID" ColumnA = "MyCurrency" ColumnB = "CurrencyName" <TestCase> TestType = "VALUE" ColumnA = "MyCurrency" ColumnB = "CurrencyName" Operator = "=" Comments = "" CaseInsensitive = false TrimRightWhitespace = true TreatNullsEqual = true <TestCase> TestType = "AGG" Aggregate = "SUM" <ExpressionA> Expression = "if(MyID>10,10,MyID)" Datatype = "integer" Precision = 10 Scale = 0 ColumnB = "LineID" Operator = "=" Comments = "" CaseInsensitive = false TrimRightWhitespace = false TreatNullsEqual = true

Table Pair with Two Flat Files


<TablePair> Name = "FLATFILE_FLATFILE" Description = "FLATFILE-FLATFILE"

Table Pair with an SQL View as a Source

139

ExternalID = "" SaveDetaliedBadRecords = true SaveBadRecordsTo = "FLAT_FILE" BadRecordsFileName = "test.txt" <TableA> Name = "pc_repository/dvo_demo/Sources/FlatFile/FlatFile" SourceDirectory = "C:\\FlatFile\\Sourcess" SourceFilename = "flatfile.txt" WhereClause = "" WhereClauseDSQ = false InDB = false <TableB> Name = "pc_repository/dvo_demo/Sources/FlatFile/FlatFile" SourceDirectory = "C:\\FlatFiles\\Targets" SourceFilename = "flatfile2.txt" WhereClause = "" WhereClauseDSQ = false InDB = false <TestCase> TestType = "SET_ANotInB" ColumnA = "order_name" ColumnB = "order_name" Operator = "=" Comments = "" CaseInsensitive = true TrimRightWhitespace = true TreatNullsEqual = true

Single-Table Constraint
<SingleTable>
    Name = "DIMEMPLOYEES"
    Description = "DIMEMPLOYEES"
    ExternalID = ""
    <TableA>
        Name = "pc_repository/dvo_demo/Targets/dimEmployees"
        Connection = "dvo_demo_connection"
        WhereClause = ""
        WhereClauseDSQ = false
        InDB = false
    <Key>
        ColumnA = "EmployeeID"
    <TestCase>
        TestType = "AGG"
        Aggregate = "COUNT"
        ColumnA = "EmployeeID"
        <ExpressionB>
            Expression = "100,200"
            Datatype = "integer"
            Precision = 10
            Scale = 0
        Operator = "Between"
        Comments = ""
        CaseInsensitive = false
        TrimRightWhitespace = false
        TreatNullsEqual = true
    <TestCase>
        TestType = "NOT_NULL"
        ColumnA = "LastName"
        ColumnB = ""
        Operator = "="
        Comments = ""
        CaseInsensitive = false
        TrimRightWhitespace = false
        TreatNullsEqual = true


SQL View
<SQLView>
    Name = "SQLView_991"
    Description = "MyView991"
    <Table>
        Name = "pc_repository/dvo_demo/Sources/demo_connection/srcOrders"
        Connection = "dvo_demo_connection"
    SQLQuery = "Select * from srcOrders"
    Comments = "This is a comment"
    <Columns>
        <Column>
            Name = "MyID"
            Datatype = "int"
            Precision = 10
            Scale = 0
        <Column>
            Name = "MyCurrency"
            Datatype = "varchar"
            Precision = 30
            Scale = 0
        <Column>
            Name = "MyAmount"
            Datatype = "decimal"
            Precision = 10
            Scale = 2

Lookup View
<LookupView>
    Name = "LookupView"
    Description = "Lookup srcOrders --> dimProducts"
    <SourceTable>
        Name = "pc_repository/dvo_demo/Sources/demo_connection/srcOrders"
        Connection = "dvo_demo_connection"
    <LookupTable>
        Name = "pc_repository/dvo_demo/Targets/dimProducts"
        Connection = "dvo_demo_connection"
    <Join>
        ColumnA = "ProductID"
        ColumnB = "ProductID"
        ColumnA = "ProductName"
        ColumnB = "ProductName"


APPENDIX F

Glossary
A
aggregate tests
Tests that check for lack of referential integrity in the source, incorrect ETL logic in the WHERE clauses, and row rejection by the target system.

B
bad records
Records that fail a test. You can view the bad records in the Reports tab. Aggregate tests do not display bad records.

C
constraint value
A representation of a constant value against which you can compare the field values.

count test
A test that compares the number of values in each field of the tables in a table pair.

D
data validation
The process of testing and validating data in a repository.

Data Validation Option repository


The database that stores the test metadata and test results.

Data Validation Option user


A user who creates and runs tests in Data Validation Option. Data Validation Option stores the settings for each user in a unique user configuration directory.

DVOCmd
The command line program for Data Validation Option that you can use to perform data validation tasks.

Data Validation Option folder


The folder in the Data Validation Option repository that stores the test metadata.

F
format test
A test that checks if the datatype of the fields in the source and target tables match.

I
inner join
A join that returns all rows from multiple tables where the join condition is met.

J
Join View
A join view is a virtual table that contains columns from related heterogeneous data sources joined by key columns. You can use a join view in a table pair or single table.

L
lookup view
A view that looks up a primary key value in a lookup table or reference table with a text value from a source. Lookup view stores the primary key in the target fact table. Lookup view allows you to test the validity of the lookup logic in your transformations.

O
outer join
A join that returns all rows from one table and those rows from a secondary table where the joined fields are equal.

S
single table
Data Validation Option object that references a database table or flat file in a PowerCenter repository. Use a single table to create tests that require data validation on a table.

SQL view
A set of fields created from several tables and several calculations in a query. You can use an SQL View as a table in a single table or table pair.

T
table pair
A pair of sources from the PowerCenter repository or lookup views and SQL views that you create in Data Validation Option. You can select a relational table or flat file.


threshold
Numeric value that defines an acceptable margin of error for a test. You can enter a threshold for aggregate tests and for value tests with numeric datatypes. If the margin of error exceeds the threshold, Data Validation Option marks the record as a bad record.

U
unique test
A test to check if the value in a field is unique.

V
value test
A test that compares the values of fields in each row of the tables in a table pair to determine whether the values match.


INDEX

A
architecture Data Validation Option 2 automatic generation compare folder 58 table pair 58

B
bad records single-table constraints 75 table pair tests 61 behavior changes in 9.1.0 Data Validation Option 11 behavior changes in 9.1.2.0 Data Validation Option 10 behavior changes in 9.1.4.0 Data Validation Option 10

C
client Data Validation Option 13 client layout Join Views tab 16 Lookup Views tab 15 SQL Views tab 15 Tests tab 15 configuration instructions for additional users 26 CopyFolder command syntax 104 CreateUserConfig command syntax 105

menus 18 new features 9 new features in 9.1.2.0 8 new features in 9.1.4.0 7 overview 1 required system permissions 21 Settings folder 19 system requirements 3 UNIX 29 upgrade 27 upgrading from version 3.0 27 upgrading from version 3.1 28 users 1 datatypes single table tests 118 table pair tests 117 DisableInformaticaAuthentication command syntax 106 DVOCmd CopyFolder command 104 CreateUserConfig command 105 DisableInformaticaAuthentication command 106 ExportMetadata command 106 ImportMetadata command 107 InstallTests command 107 location 103 overview 103 PurgeRuns command 109 RefreshRepository command 110 RunTests command 111 syntax 103 troubleshooting 116 UpgradeRepository command 113

E
ExportMetadata command syntax 106

D
data validation purpose 1 testing approach 3 typical workflow 2 Data Validation Option architecture 2 behavior changes in 9.1.0 11 behavior changes in 9.1.2,0 10 behavior changes in 9.1.4,0 10 client 13 configuration for additional users 26 installation for first user 22 installation overview 20 installation prerequisites 21 installation required information 22

F
folders copying 17 copying at the command line 104 overview 16 refreshing 37 restrictions on copying 16 Settings folder 19 troubleshooting copying folders 115


I
ImportMetadata command syntax 107 installation instructions for first user 22 overview 20 prerequisites 21 required information 22 required permissions 21 troubleshooting 114, 115 UNIX 29 upgrading from version 3.0 27 upgrading from version 3.1 28 InstallTests command syntax 107

metadata import syntax lookup view 141 single-table constraint 140 SQL view 141 table pair with an SQL view source 139 table pair with one test 138 table pair with two flat files 139

N
new features Data Validation Option 9 new features in 9.1.2.0 Data Validation Option 8 new features in 9.1.4.0 Data Validation Option 7

J
join views adding 88 overview 84 properties 85 WHERE clause 87 Join Views tab Data Validation Option 16 joins heterogeneous sources 83 table pairs 45

P
permissions Data Validation Option installation 21 prerequisites Data Validation Option installation 21 PurgeRuns command syntax 109

R
RefreshRepository command syntax 110 report views meta_ds_view 136 meta_jv_view 135 meta_lv_view 134 meta_sv_view 133 meta_tp_view 136 overview 129 results_id_view 133 results_summary_view 129 rs_bad_records_view 132 rs_jv_id_view 137 rs_lv_id_view 137 rs_sv_id_view 137 reports custom 95 Detailed Test Results-Bad Records example 123 Detailed Test Results-Test example 122 exporting 95 filtering 94 generating 93 lookup view definitions 94 overview 93 printing 95 scrolling 95 SQL view definitions 94 Summary of Testing Activities example 119 table of contents 95 Table Pair Summary example 120 troubleshooting 115 viewing 95 Reports Overview 93 repositories adding 36 deleting 37

L
lookup views adding 81 deleting 82 description 81 directory for file sources 81 editing 81 example 82 joining flat files 83 joining heterogeneous tables 83 metadata import syntax 141 name for file sources 81 overriding source table owner 81 overview 79 properties 80 selecting connections 81 selecting the lookup table 80 selecting the source table 80 source to lookup relationship 81 Lookup Views tab Data Validation Option 15

M
menus description 18 metadata export exporting at the command line 106 exporting objects 39 overview 38 metadata import importing at the command line 107 importing objects 39 overview 38


editing 37 exporting metadata 38 overview 36 refreshing 37 refreshing at the command line 110 refreshing folders 37 saving 36 testing connections 36 upgrading at the command line 113 RunTests command cache settings 107, 111 send email 111 syntax 107, 111

overview 76 properties 76 retrieving data 78 SQL statement 78 table definitions 77 SQL Views tab Data Validation Option 15 system requirements Data Validation Option 3

T
table pair tests adding 57 automatic generation 58 bad records 61 comments 56 concatenating strings 56 conditions A and B 54 datatypes 117 deleting 57 descriptions 52 editing 57 excluding bad records 55 expression tips 56 fields A and B 53 filter condition 54 IF statements 56 ignoring case 55 margin of error 55 null values 56 operators 54 preparing at the command line 107 properties 52 purging test runs at the command line 109 running 58 running at the command line 111 substrings 56 threshold 55 trimming trailing spaces 55 troubleshooting 114, 115 types 51 using expressions for fields 56 values to test 53 table pairs adding 48 deleting 49 editing 49 joining tables 45 metadata import syntax , SQL view source 139 metadata import syntax, one test 138 metadata import syntax, two flat files 139 overview 41 processing large tables 43 properties 41 pushing logic to the source 44 test properties 52 test types 51 viewing test results 50 WHERE clause 44 testing and methodology comparing sources and targets 5 counts, sums, and aggregate tests 3 data testing approach 3 enforcing constraints on target tables 4 target table referential integrity 4 validating logic with SQL views 5

S
scripting metadata export and import 38 Settings folder displaying 19 single table tests adding 74 bad records 75 condition 72 constraint value 73 datatypes 118 deleting 75 descriptions 71 editing 74 field 72 filter condition 72 operators 72 overview 70 preparing at the command line 107 properties 70 purging test runs at the command line 109 running 75 running at the command line 111 troubleshooting 114, 115 values to test 72 single tables adding 67 deleting 68 editing 68 properties 62 test properties 70 viewing test results 69 single-table constraints adding 74 bad records 75 datatypes 118 deleting 75 editing 74 metadata import syntax 140 overview 62 properties 62 running 75 test properties 70 viewing test results 69 SQL views adding 78 adding comments 78 column definitions 77 connections 77 deleting 78 description 77 editing 78 metadata import syntax 141


Tests tab Data Validation Option 15 troubleshooting command line errors 116 copying folders 115 initial errors 114 installation errors 114, 115 no test results 114, 115 ongoing errors 115 overview 114 report generation 115 repository connections 114 run errors 114, 115

UpgradeRepository command syntax 113 upgrading version 3.0 to current version 27 version 3.1 to current version 28 user configuration directory changing with a batch file 33 changing with an environment variable 33 creating at the command line 105 location 32 users Data Validation Option 1

U
UNIX Data Validation Option 29 upgrade Data Validation Option 3.0 27 Data Validation Option 3.1 27

W
WHERE clause join views 87 table pairs 44

