How a stronger technology focus contributes to faster delivery of business value

    How a stronger technology focus with TFS and PowerShell optimizes continuous development processes and contributes to faster delivery of business value.


    As a Captiva Consultant and DevOps Evangelist, I am always looking for ways to improve the team’s way of working and speed up the delivery process. Automation is key.

    In DevOps, a continuous delivery environment is set up to optimize development processes: it focuses on rapid development and deployment of bite-sized pieces of software, plus automated testing and monitoring. This way, the development and operations teams get feedback faster and can react more quickly to new business needs.

    Both Team Foundation Server (TFS), Microsoft's product that provides source code management, and PowerShell, a task automation and configuration management framework also from Microsoft, provide all the tools and features needed to do this. Every aspect of moving a piece of code from development to production can be automated with PowerShell; even the tests can be written in PowerShell. This makes it incredibly versatile and powerful.
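
    As a minimal sketch of what "automating a deploy step with PowerShell" can look like, the fragment below stops a service, copies a build artifact into place and restarts the service. All paths, the share name and the service name are hypothetical examples, not the actual Captiva set-up:

    ```powershell
    # Hypothetical deploy step: copy a build artifact to a target folder
    # and restart the capture service. Names below are illustrative only.
    param(
        [string]$ArtifactPath = '\\buildserver\drops\CaptivaApp\latest',
        [string]$TargetPath   = 'D:\Captiva\App',
        [string]$ServiceName  = 'CaptivaService'
    )

    Stop-Service -Name $ServiceName -ErrorAction Stop
    Copy-Item -Path (Join-Path $ArtifactPath '*') -Destination $TargetPath -Recurse -Force
    Start-Service -Name $ServiceName
    ```

    Because steps like this live in a script rather than a manual cookbook, they can be versioned, reviewed and re-run identically on every environment.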

    The OpenText Captiva products are used to automate document capture and enterprise capture – in other words, to automatically digitize paper documents, such as files, contracts, invoices or letters.

    How we set up a continuous delivery process

    In a previous blog I explained a continuous delivery process using Git, Artifactory, Jenkins, Nolio and HP Unified Functional Testing (UFT). Since then, the team has updated the release/deploy process to use Microsoft Team Foundation Server (TFS) instead of Jenkins and PowerShell instead of Nolio, seeking a stronger technology focus and a more uniform stack.


    These are the main steps in this set-up:

    1. Developers work at their workstations with Captiva Designer or an IDE and commit code to the Git repository
    2. Creation of a build (artifact) in TFS
    3. Automated testing using UFT and PowerShell
    4. Release with TFS and PowerShell
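
    To make the flow concrete, the four steps above can be sketched as one PowerShell driver script. In the real set-up TFS triggers each stage separately; the paths, repository location and helper script name here are hypothetical:

    ```powershell
    # Illustrative outline of the four pipeline steps in sequence.
    # In practice TFS orchestrates these stages; names are placeholders.
    git -C 'C:\src\captiva-project' pull                              # 1. latest committed code
    Compress-Archive -Path 'C:\src\captiva-project\*' `
                     -DestinationPath 'C:\drops\captiva.zip' -Force   # 2. build artifact
    Invoke-Pester -Path 'C:\src\captiva-project\tests'                # 3. automated tests
    .\Deploy-Captiva.ps1 -Artifact 'C:\drops\captiva.zip'             # 4. release
    ```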

    Step 1: Development in Captiva Designer and commit to Git

    The first step in the process is the developers working on their workstations. They use the Captiva Designer to make changes to the development environment as they please. When they are done, the developers commit their code to the Git repository. Git is used primarily for version control in this set-up.

    Step 2: Creation of a TFS Build

    This second step has a simple set-up: TFS copies the latest committed code to a central location and creates an artifact. This container ensures that only the right items are used and deployed.
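
    A build step of this kind can be approximated in PowerShell as staging the sources and packaging them into a versioned archive. The directories and the date-based version scheme below are example assumptions, not the team's actual configuration:

    ```powershell
    # Sketch of the build step: stage the latest sources and package them
    # into a versioned artifact. Paths and version scheme are examples.
    $source  = 'C:\agent\_work\s'                 # build agent working directory (example)
    $staging = 'C:\agent\_work\a\CaptivaBuild'
    $version = Get-Date -Format 'yyyyMMdd.HHmm'

    New-Item -ItemType Directory -Path $staging -Force | Out-Null
    Copy-Item -Path (Join-Path $source '*') -Destination $staging -Recurse -Force
    Compress-Archive -Path "$staging\*" -DestinationPath "C:\drops\CaptivaBuild-$version.zip" -Force
    ```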

    Step 3: Using UFT and PowerShell for automated testing

    At the end of the deployment process, we kick off an automated test using UFT or PowerShell combined with Pester, a unit testing framework for PowerShell. This test creates a batch by importing images, then indexes them and verifies the data in Captiva.
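
    The Pester side of such a test could look like the sketch below. The helper script and the field names are placeholders; in reality the batch is submitted and inspected through Captiva's own tooling:

    ```powershell
    # Minimal Pester sketch of the validation idea: import test images as a
    # batch, then assert on the indexed result. Submit-CaptivaBatch.ps1 and
    # the field names are hypothetical placeholders.
    Describe 'Captiva batch processing' {
        BeforeAll {
            # Hypothetical helper that submits images and returns index data
            $result = .\Submit-CaptivaBatch.ps1 -Images 'C:\tests\invoices\*.tif'
        }

        It 'creates exactly one batch' {
            $result.BatchCount | Should -Be 1
        }

        It 'extracts the invoice number from the test image' {
            $result.Fields['InvoiceNumber'] | Should -Not -BeNullOrEmpty
        }
    }
    ```

    Running `Invoke-Pester` on a file like this after every deployment gives the pipeline an automatic pass/fail signal instead of a manual check.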

    Step 4: Release with TFS and PowerShell

    TFS is a configurable release management tool that can be used to deploy to multiple environments and create the automatic deployment scripts, replacing the manual cookbook.

    Captiva only runs on Windows. To simplify our stack, we moved from Nolio and Jenkins to the TFS release management tool, migrating all Nolio configurations to PowerShell. This enabled us to treat our development pipeline as code more easily.


    • We create a folder structure on a server, then set and test authorizations
    • We run a Captiva tool from the command line, restart services and then run a combination of UFT and PowerShell tests to validate the environment
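
    The environment-preparation bullets above translate into PowerShell along these lines. The folder layout, the service account and the service name are illustrative assumptions:

    ```powershell
    # Sketch of environment preparation: create folders, grant rights to a
    # (hypothetical) service account, restart the capture service.
    $root = 'D:\Captiva'
    foreach ($dir in 'Input', 'Output', 'Logs') {
        New-Item -ItemType Directory -Path (Join-Path $root $dir) -Force | Out-Null
    }

    # Grant modify rights to the example capture service account
    $acl  = Get-Acl -Path $root
    $rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
        'DOMAIN\svc-captiva', 'Modify', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
    $acl.AddAccessRule($rule)
    Set-Acl -Path $root -AclObject $acl

    # Restart the capture service, then hand off to the UFT/Pester validation
    Restart-Service -Name 'CaptivaService'
    ```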

    If we change the deploy/release procedure, we commit a new version and deploy it through a TFS pipeline.


    From a DevOps perspective, we didn't need to move away from Nolio and Jenkins. But the move helped us simplify the stack by removing Java-based components from our servers, which means less complexity, maintenance and upgrading.

    Our stronger technology focus resulted in better code and faster integration and delivery of business value. It also gave us the opportunity to treat everything as code: coding activities, infra management and authorization items, monitoring activities, etc.

    You can also follow Dennis Van Aelst on the EMC Community blog.

    Last updated on 01/08/2018

    #Digital Strategy, #Software Development

    About the author

    Dennis Van Aelst is an ECM consultant and Team Coach at Amplexor, based in the Netherlands. As an OpenText Captiva product specialist, he focuses on structuring, automating and managing business processes. Using Agile and DevOps continuous delivery best practices, Dennis creates team working environments where business continuity, transparency and human capital come first.