Post Deployment Steps in Informatica Power Center

Based on the architecture, application, design, deployment cycle and environment (technical setup), there will be additional aspects to check, consider and verify before, during and after the deployment.

Backups: In any case, always take a backup of all the objects and the application’s code before doing any deployment. If something goes wrong, it will be easy to revert the changes instead of keeping the system unavailable while debugging. This applies to DB/UNIX scripts, Informatica objects and any other dependent application code.
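As a minimal sketch, the repository backup and a folder export can be scripted with pmrep before the deployment starts. The repository, domain, folder and file names below are placeholders, and the exact pmrep flags may differ by PowerCenter version, so verify them against your version’s documentation.

    #!/bin/sh
    # Hedged sketch: back up the repository and export the folder being
    # deployed to, before the deployment. Names and flags are placeholders.

    REPO="REP_DEV"            # repository name (assumption)
    DOMAIN="Domain_Dev"       # domain name (assumption)
    USER="deploy_user"        # deployment user (assumption)
    FOLDER="SALES_DW"         # target folder (assumption)
    STAMP=$(date +%Y%m%d_%H%M%S)

    # Connect to the repository; password comes from an environment variable
    pmrep connect -r "$REPO" -d "$DOMAIN" -n "$USER" -x "$PMPASS" || exit 1

    # Server-side backup of the whole repository
    pmrep backup -o "backup_${REPO}_${STAMP}.rep"

    # Export the folder's objects to XML as a local, diff-able copy.
    # Note: on some versions a whole-folder export needs the objects listed
    # via -n or a persistent input file (-i) instead of just -f.
    pmrep objectexport -f "$FOLDER" -u "pre_deploy_${FOLDER}_${STAMP}.xml"

    # Also snapshot UNIX scripts, parameter files and DB scripts
    tar -czf "app_code_${STAMP}.tar.gz" /app/scripts /app/paramfiles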

Code Comparison:

Compare the original code and the changed code to verify the differences. This can be done at the folder level once the deployment is complete. If you have automated the deployment using the command-line options, you can also include this comparison in the script.

A manual check may also be needed, depending on the criticality and nature of the change.

This covers not only the Informatica objects, but also the database scripts, UNIX scripts, parameter files and any other related or dependent application code and objects.
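One hedged way to fold this into a deployment script is to export the folder before and after the import and diff the two XML files. The file names and the attributes stripped before the diff are assumptions about the export format.

    #!/bin/sh
    # Hedged sketch: compare pre- and post-deployment folder exports.
    # Assumes both XML files were produced by pmrep objectexport; names
    # are illustrative.

    PRE="pre_deploy_SALES_DW.xml"
    POST="post_deploy_SALES_DW.xml"

    # Strip volatile attributes (timestamps, version numbers) before
    # diffing, otherwise every object shows as changed. The sed patterns
    # are assumptions about the export format -- adjust for your version.
    clean() {
      sed -e 's/ TIMESTAMP="[^"]*"//g' -e 's/ VERSIONNUMBER="[^"]*"//g' "$1"
    }

    clean "$PRE"  > /tmp/pre_clean.xml
    clean "$POST" > /tmp/post_clean.xml

    if diff -u /tmp/pre_clean.xml /tmp/post_clean.xml > folder_diff.txt; then
      echo "No differences between pre- and post-deployment exports."
    else
      echo "Differences written to folder_diff.txt -- review against the release notes."
    fi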

Relational Source/Target:

There can be many scenarios here; to name a few:

  • Loading the test data into the sources/reference tables
  • Truncating the target tables
  • Revalidating the data existing in the tables
  • Adding a new attribute and the value to it

Yes, these should be part of the database scripts and should be run before the Informatica code deployment. There might also be some post-deployment scripts that must be run. Of course, the development team has to provide you the details.
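A minimal wrapper that runs the pre-deployment database scripts in a fixed order and logs each one might look like the sketch below. It assumes an Oracle source and sqlplus; the connect string and script names are placeholders for whatever the development team hands over.

    #!/bin/sh
    # Hedged sketch: run pre-deployment DB scripts in order.
    # Assumes Oracle/sqlplus; connect string and script names are placeholders.
    # Each .sql script is assumed to contain WHENEVER SQLERROR EXIT FAILURE,
    # otherwise sqlplus returns 0 even when a statement fails.

    DB_CONN="stage_user@DEVDB"                 # assumption: password prompted or via wallet
    SCRIPT_DIR="/app/release/pre_sql"          # assumption

    for script in 01_truncate_targets.sql 02_load_reference_data.sql 03_add_new_attribute.sql
    do
      echo "Running ${script} ..."
      sqlplus -s "$DB_CONN" @"${SCRIPT_DIR}/${script}" > "${script}.log" 2>&1 || {
        echo "FAILED: ${script} -- stopping, see ${script}.log"
        exit 1
      }
    done
    echo "Pre-deployment DB scripts completed."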

Connections: 

Needless to say, check that all the connections point to the standard $DBConnection_** for source tables, $DBConnection_STAGE for staging tables (if you have them) and $DBConnection_AUDIT for audit tables (if you have them). Use $Source and $Target accordingly instead of direct hardcoded values; the best practice is not to hardcode any connection.
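To spot hardcoded connections after the deployment, one hedged option is to scan the post-deployment folder export for connection references that do not use a $DBConnection, $Source or $Target variable. The CONNECTIONREFERENCE element name and the export file name below are assumptions about the objectexport format; verify them against a sample export.

    #!/bin/sh
    # Hedged sketch: flag session connection references in the folder export
    # that are not mapped to a connection variable. Element/attribute names
    # are assumptions about the export XML format.

    EXPORT_XML="post_deploy_SALES_DW.xml"      # illustrative file name

    grep -o '<CONNECTIONREFERENCE[^>]*>' "$EXPORT_XML" \
      | grep -v '\$DBConnection' \
      | grep -v '\$Source' \
      | grep -v '\$Target' \
      > hardcoded_connections.txt

    if [ -s hardcoded_connections.txt ]; then
      echo "Possible hardcoded connections -- review hardcoded_connections.txt"
    else
      echo "All connection references appear to use connection variables."
    fi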

Parameters:

While adding a new session to your workflow, make sure to add the session details to the required configuration tables and parameter files, and prepare the insert/update scripts (if any are required). You need the database scripts only if you are running the workflows through process tables driven by the data model.

Parameter Path: Make sure the parameter file paths are updated for the environment (DEV/QA/UAT/PROD) after the deployment. This applies to both workflows and sessions.
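A small post-deployment check, sketched below under the assumption that parameter files live in an environment-specific directory, is to verify that each expected parameter file actually exists for the target environment. The directory layout and file names are placeholders for your own convention.

    #!/bin/sh
    # Hedged sketch: verify environment-specific parameter files exist after
    # deployment. Directory and file names are assumptions -- adapt them.

    ENV="${1:-QA}"                               # target environment
    PARAM_DIR="/app/infa/paramfiles/${ENV}"      # assumption

    MISSING=0
    for pf in wf_load_sales.param wf_load_customers.param
    do
      if [ ! -f "${PARAM_DIR}/${pf}" ]; then
        echo "MISSING: ${PARAM_DIR}/${pf}"
        MISSING=1
      fi
    done

    [ "$MISSING" -eq 0 ] && echo "All expected parameter files present for ${ENV}."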

Tracing Level: This should always be set to ‘Normal’, unless you or the development team set it to ‘Verbose Data’ for debugging purposes. Setting this property to anything other than ‘Normal’ will increase the session log file size, and if there is not enough space the Integration Service fails, which then forces a cleanup exercise.
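The tracing level can also be checked in bulk from the folder export. The sketch below assumes the session override appears in the XML under an attribute named something like “Override tracing”; confirm the exact attribute name against a sample export before relying on it.

    #!/bin/sh
    # Hedged sketch: list sessions whose tracing-level override is not
    # 'Normal' in the folder export. The attribute name is an assumption.

    EXPORT_XML="post_deploy_SALES_DW.xml"      # illustrative file name

    grep 'Override tracing' "$EXPORT_XML" | grep -v '"Normal"' > tracing_overrides.txt

    if [ -s tracing_overrides.txt ]; then
      echo "Non-Normal tracing overrides found -- see tracing_overrides.txt"
    else
      echo "No non-Normal tracing overrides found."
    fi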

Fail Parent: Making sure this option is checked is mostly the development team’s responsibility. The QA team can also verify it, along with adherence to the naming standards that should be followed.

Pre- and Post-SQL: Check the pre/post-session SQL for anything tied to the change of environment; this can be identified by a test run.
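Before the test run, a quick hedged scan of the folder export for pre/post SQL overrides that still reference another environment’s schema or database link can catch the obvious cases. The attribute names and the DEV/UAT strings below are assumptions about your export format and naming convention.

    #!/bin/sh
    # Hedged sketch: scan the folder export for pre/post SQL overrides that
    # still mention another environment's schema or DB link. Attribute names
    # and the DEV/UAT strings are assumptions -- adjust to your naming.

    EXPORT_XML="post_deploy_SALES_DW.xml"      # illustrative file name

    grep -iE 'Pre SQL|Post SQL' "$EXPORT_XML" \
      | grep -iE 'DEV_|UAT_|@DEVDB|@UATDB' \
      > suspect_pre_post_sql.txt

    if [ -s suspect_pre_post_sql.txt ]; then
      echo "Pre/Post SQL referencing other environments -- see suspect_pre_post_sql.txt"
    else
      echo "No environment-specific references found in pre/post SQL overrides."
    fi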

Code Check-in: Last but not least, the code should, of course, be checked in.

If you follow other methods to validate your deployments better, do let us know.
