Posts Tagged ‘Content Server’

kVisia: Product Overview

March 18, 2015

Howdy Users and Buddies,

Today I want to share kVisia, a product that provides the core functionality to automate, control, and deliver an organization’s engineering and enterprise business processes to the desktop or the Web. It holds the business rules, executes the XML configurations, builds the business forms, and implements the lifecycles that automate an organization’s business processes.

The only prerequisites you need are a basic understanding of client-server applications and of the Documentum architecture.

Let me lay out the content for better understanding, with a bit of background on what is there, what is missing, and how it can be achieved.

  1. Abstract

The current era is defined by terms like data, information, knowledge, and wisdom, and in this electronic world the difficulty of managing data in abundance has begotten many technologies; one of them is Documentum.

Documentum is, at heart, a content management tool, but its vastness and its ability to cater to almost all the data types available make it so rich that it is now widely used. If we unpack Documentum technically, it has an edge over a relational database management system in that it stores not only the metadata but also the content in its native format.

kVisia blends with Documentum to bolster the level and depth of automation beyond what Documentum can achieve alone, and that is what this post discusses in detail.

  2. Introduction

kVisia comes from McLaren Software Limited, and users need a license before using it. McLaren is ISO 9001 certified and is an accredited Independent Software Vendor for Documentum and FileNet. It has 360+ customers worldwide and operates from the UK, the USA, and Switzerland.

kVisia is an Enterprise Engineer product that encapsulates best practices and domain expertise. It is a user-configurable application consisting of XML files managed in the repository and standardized on a single ECM platform, powered by Documentum.

For a better understanding of kVisia, the following questions and answers will help identify the need for it.

How does it help me?

It lets you configure the user interface very quickly and deliver it to desktop and Web users.

Is it easy to change?

Very easy; the configuration files are stored in the Docbase in XML format.

The interface is configurable using objects and tables.

Do the configuration files allow me to control inputs?

Yes; very complex business rules can be enforced via the XML files.

Validations apply at creation, promotion, and check-in.

  3. kVisia Suite

The kVisia suite consists of the following three components:

  1. McLaren Studio
  2. McLaren_Core DocApp
  3. McLaren_Foundation DocApp

3.1 McLaren Studio

McLaren Studio is a user-friendly desktop application that allows users to create and edit XML configuration files that conform to the rules of the Document Type Definition (DTD).

An XML (eXtensible Markup Language) configuration file is a structured file consisting of many elements. Elements are ordered into a tree structure or hierarchy. Elements may have parent or child elements, and may also contain attributes. An element without a parent element is the highest in a tree structure and is called a root element. Not all of the elements will have child elements.

The structure of a valid XML file is defined in a DTD (Document Type Definition) file. The DTD contains the rules that apply to each XML configuration file; these include the definitions of parent and child elements, element attributes, and how each element can be used.

The appearance and behaviour of user dialogs can be modified without the need to change programme code simply by changing settings in the appropriate XML configuration file. A definition of the appearance and behaviour of a dialog is referred to as a “configuration”. A common need, however, is to display different configurations under different circumstances. For example, members of different groups may need to fill in different properties on documents, and the document’s properties may change as it moves from state to state in a lifecycle. The definition of such a scenario is referred to as a “mapping”. When you do not want to differentiate functionality between different groups, lifecycles and states, there is a value called {default} that will apply to all circumstances.
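The mapping resolution described above can be sketched in plain Python. This is an illustration only, not McLaren code: the group, lifecycle, state, and configuration names are hypothetical, and the real resolution rules are defined by kVisia itself.

```python
# Illustrative sketch: pick a dialog "configuration" from a "mapping" table,
# falling back to the {default} value when no specific entry matches.
DEFAULT = "{default}"

# Hypothetical mapping table: (group, lifecycle, state) -> configuration name.
mappings = {
    ("engineers", "eng_lifecycle", "Draft"): "eng_draft_dialog",
    ("engineers", "eng_lifecycle", DEFAULT): "eng_generic_dialog",
    (DEFAULT, DEFAULT, DEFAULT): "standard_dialog",
}

def resolve_configuration(group, lifecycle, state):
    """Try the most specific mapping first, then fall back to {default}."""
    for key in (
        (group, lifecycle, state),
        (group, lifecycle, DEFAULT),
        (group, DEFAULT, DEFAULT),
        (DEFAULT, DEFAULT, DEFAULT),
    ):
        if key in mappings:
            return mappings[key]
    return None

print(resolve_configuration("engineers", "eng_lifecycle", "Approved"))  # eng_generic_dialog
print(resolve_configuration("reviewers", "other", "Any"))               # standard_dialog
```

The point of the sketch is only the fallback behaviour: a configuration keyed entirely on {default} applies in all circumstances that no more specific mapping covers.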

The actions for which dialogs can be defined using an XML configuration file are:

New

When the user issues the New command, a customized dialog can be displayed. Commonly deployed features include automatic document numbering, interactive project allocation and any additional attributes (optional or mandatory) which may be required to be stored for the document.

Properties

A customized Properties dialog can be displayed. Fields can be made editable or read-only as required.

Import

When the user performs an import, a customized dialog can be displayed which may typically employ features similar to those used in the New dialog.

Copy

When the user performs a copy and paste action, a customized dialog can be displayed which may typically employ features similar to those used in the New dialog.

QuickFind

For each document type, a search dialog referred to as QuickFind can be defined and viewed by the user in the Docbase through the McLaren menu. The attributes that can be searched on are configurable, and search options such as Between, Greater Than, Containing, etc., can be specified.

 3.2 McLaren_Core DocApp

Having installed kVisia, you get two DocApps in the Docbase: McLaren_Core and McLaren_Foundation. The McLaren_Core DocApp is available to every user of a Documentum repository. It is suggested that you get an overview of the additional functionality deployed by kVisia before configuring the product to suit the business need.

As the name suggests, the McLaren_Core DocApp provides the basic functionality to be configured, such as New, Import, Properties, Copy, and QuickFind. As discussed above, we use the XML configuration tool McLaren Studio to configure it.

For each Object Type used in the repository, an XML Configuration File can be created and configured to define the appearance and behavior of the various dialogs for documents of that Object Type. The definition of the appearance and behavior of a dialog is referred to as a configuration, and different configurations can be displayed in different circumstances.

3.3 McLaren_Foundation DocApp

The McLaren_Foundation installation lets you deploy a pre-configured implementation of the Core technology, transforming it into a user application specifically adapted to the engineering environment and providing a ready-to-use engineering repository. It deals with the following configurations:

User Roles that are associated with menu systems and user dialogs adapted to those roles.

Object Types for documents and folders.

XML Configuration Files that define the content, appearance and behavior of the New, Copy, Properties, Import and QuickFind dialogs for the supplied Object Types in accordance with the current user’s User Role and the document’s lifecycle and state, including pre-configured, automatically generated document numbering.

Check in dialogs which are specifically adapted for each supplied type of engineering document.

Automated document revision numbering that reflects common design practice: revision numbers start at 0, the first issued version is 1, the second 2, the third 3, and so on. A numeric revision is assigned when a document is first checked in and incremented on each subsequent check-in.
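As a minimal sketch (plain Python, not kVisia code), the revision scheme described above behaves like this:

```python
# Illustrative sketch of the revision-numbering scheme: a new document starts
# at revision 0, and each check-in increments the revision number, so the
# first issued version is 1, the second 2, and so on.
class Document:
    def __init__(self, name):
        self.name = name
        self.revision = 0  # new, never-issued document

    def check_in(self):
        self.revision += 1
        return self.revision

doc = Document("P-1001-drawing")  # hypothetical document name
print(doc.revision)      # 0 before the first check-in
print(doc.check_in())    # 1 -> first issued version
print(doc.check_in())    # 2
```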

  4. Traditional VB (Desktop) vs. kVisia (Desktop & Web)

The following comparison shows how much flexibility kVisia provides and how much time and effort it saves.

Traditional VB (Desktop):

  • Obtain VB source code.
  • Open the VBP.
  • Edit the GUI.
  • Add code to populate the list.
  • Add validation rules.
  • Save the source code.
  • Compile the DLL.
  • Create the CAB file.
  • Open the DocApp in Composer.
  • Edit the component ACX and replace the CAB file.
  • Check the DocApp back in.
  • Change not available until the next user login.

kVisia (Desktop and Web):

  • Check out the XML.
  • Edit the XML in kVisia Studio.
  • Add the XML tag.
  • Specify properties.
  • Save and check back in.
  • Change available immediately (no logout required).

  5. Benefits
  • Easy to manage development process as applications are configured via XML using Studio
  • Fast route from innovation to user acceptance
  • Shorter time from design through to development
  • Shorter time to deploy to the business users
  • Reduces development costs
  • Minimises the effort of future upgrades
  • Simple deployment with local control
  • Speeds up implementation process through rapid configuration

Hope you like the post. Feel free to post your comments, and I will reply to any queries that you have.

Adios, have a great day ahead…

Documentum Folder Bulk Export

February 10, 2013

Hi Readers,

It’s been a while since I last posted; here I am, back with all your requests.

Someone wrote a note asking me to post something on folder export, so here it is:

There is often a requirement to export an entire folder structure from a Documentum repository to the local file system. The Documentum web applications allow export of only a single file, not an entire folder structure with all its files.

This post illustrates the usage of a free tool, the ‘Documentum Deep Export Utility’, available on the EMC Developer Community site, to export an entire folder structure with all files from Documentum to the local file system.

This post also details the procedure for enabling Deep Export natively on Documentum 6.5, enumerating the pros and cons of the native functionality vs. the external utility.

Let me break this into sections:


The cabinet structure in Documentum is required to be replicated to the local file system, including the folder structure and all files. Export of metadata from Documentum is not included in scope.


  • Ensure that a fast data connection to the content server is available.
  • Ensure that there is sufficient disk space for exporting the files.
  • This utility has not been tested on docbases that have a very high degree of interlinked folders.
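What a deep export ultimately produces can be sketched in a few lines of Python: the folder hierarchy is walked top-down, and every folder and file is recreated under a local target directory. The real utility reads content from the Docbase via DFC; in this illustration a local source tree stands in for the repository.

```python
# Illustrative sketch of deep export: replicate a folder structure and all
# files from a source tree into a target directory. Metadata is out of scope,
# matching the objective stated above.
import os
import shutil

def deep_export(source_root, target_root):
    """Recreate every folder under source_root and copy every file."""
    for dirpath, dirnames, filenames in os.walk(source_root):
        relative = os.path.relpath(dirpath, source_root)
        destination = os.path.join(target_root, relative)
        os.makedirs(destination, exist_ok=True)
        for filename in filenames:
            shutil.copy2(os.path.join(dirpath, filename),
                         os.path.join(destination, filename))
```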

Enabling Native Deep Export Functionality on Documentum 6.5

This method enables the native Deep Export functionality introduced in Documentum 6.5. It adds a right-click menu item, labeled ‘Export’, when the user right-clicks a folder in Webtop. The whole folder is saved to the user-specified directory, without the metadata.

  • Deep export of a folder whose name contains special characters (for example : ? < > " | *) is not supported. If you try to export such a folder, the application throws an error.
  • Hidden folders are included when you deep-export a parent folder: even if a folder is not visible in Webtop, all its sub‑folders, including hidden ones, get exported.
  • Deep export is supported for UCF content transfer, not HTTP content transfer.
  • Only the primary content is exported, no renditions are exported.
  • Only the current versions of documents are exported.
  • A VDM root and its children are exported starting from the level that is present in the folder selected for Deep export.
  • If the child of a VDM is in multiple folders, it is only exported once.
  • Deep export is trimmed down and does not fully support VDMs.


Enabling Deep Export:

  • Go to the webapps folder where the Webtop application is installed. If Apache Tomcat is being used, the location would be “C:\Program Files\Apache Software Foundation\Tomcat 7.0\webapps\webtop”.
  • In the Webtop folder, go to the ‘wdk’ folder and open the ‘app.xml’ file.
  • Search for this text:

<!-- Enable / Disable (true | false), the deep / folder export functionality. -->




Native Deep Export in D6.5 Webtop is now enabled.

Performing Deep Export via the Documentum Deep Export Utility

Setting up the Environment

Ensure that the DeepExport utility is unzipped to a folder on a system with a content server installation.

  1. Register the SBO. Go to C:\Documentum\Config
    Append the following line to the file ‘DBOR.PROPERTIES’


  2. If the source server where the files are to be exported from is a remote server, you need to set the to point to the remote server. If the source server is local, no changes to the are necessary
    Set ‘
    Enter the name of the remote server in the ‘<content server>’
  3. Assuming deepExport utility is extracted in the ‘C:\ deepExportService’ folder
    Go to this folder and open setEnv.bat
    Set Java path (including quotes)
e.g. JAVA_HOME="C:\j2sdk1.6.0_27"
  4. Set the path of dctm.jar and dfc.jar. Note that although the path of the config directory is not mentioned in the manual included with the utility, you also need to include it for the utility to operate correctly. It is included in the example below.

         eg. DCTM_JAR=C:\Program Files\Documentum\dctm.jar;C:\Documentum\config

Run the batch file from a command line


Verify whether the path has been set correctly by typing echo %CLASSPATH% on the command line

Performing the Export

Go to the deepExport folder and edit the ‘testDeepExport.bat’

  1. Set the username and password to the username and password of a user of the docbase. Ensure that the user has sufficient permissions
  2. Set the docbase name variable
  3. Set the docbase source folder name variable
  4. Set the export folder on the local system
    Note that the path must use double backslashes, and ensure that sufficient disk space is available on the system.
  5. Execute the batch file



The process takes quite some time. When the command prompt returns, the utility has finished the export.

Comparison of Utility vs. Native Export

The advantage of the utility-based export is that no changes are required to the Documentum environment itself. The export can even be done remotely by setting the as instructed above. On the other hand, the native export functionality is recommended if the export feature has to be provided to end users as a permanent feature.

For more information, check out the link below:



Documentum Full Text Index Server

November 2, 2011

For faster and better search, EMC developed a Full-Text Index Server that is installed separately from the content management software to provide index-based search capability. In version 5.2.5 SPx the full-text search engine was Verity; from 5.3 SPx onwards it was changed to FAST (Fast Search & Transfer), and later xPlore replaced FAST.

With Verity we had to explicitly define the attributes to be indexed in the content server configuration, whereas one of the salient features of FAST is that, by default, all attributes are indexed along with the content of the document. Since FAST is no longer tightly coupled with the content server installation, one has the option of not installing the Index Server. If the Full-Text Index Server is not installed, simple search performs a case-sensitive database search against the object_name, title and subject attributes of dm_sysobject and its subtypes.

This post describes the various components of Index Server and their operations.

1.   Software Components

Full-text indexing in a Documentum repository is controlled by three software components:

  • Content Server, which manages the objects in a repository, generates the events that trigger  full-text indexing operations, queries the full-text indexes, and returns query results to client applications.
  • The index agent, which exports documents from a repository and prepares them for indexing.
  • The index server, which is a third-party server product that creates and maintains the full-text index for a repository. The index server also receives full-text queries from Content Server and responds to those queries.

2.   Set Up Configuration

a)  Basic Set Up

The basic indexing model consists of a single index agent and index server supporting a single repository. The index agent and index server may be installed on the Content Server host or on a different host.

b)  Consolidated Set Up

In a consolidated deployment, a single index server provides search and indexing services to multiple repositories. The repositories may be in the same Content Server installation or on different hosts. However, all repositories must be of the same Content Server version.

3.   Index Server Processes

The index server consists of five groups of processes that have different functions.

a) Document processors

Document processors (also sometimes called procservers) extract indexable content from content files, convert DFTXML to FIXML (a format that is used directly by the indexer), and merge the indexable content with the metadata during the DFTXML conversion process. Document processors are the largest consumer of CPU power in the index server.

b) Indexer

The indexer creates the searchable full-text index from the intermediate FIXML format. It consists of two processes. The frtsobj process interfaces with the document processor and spawns different findex processes as necessary to build the index from FIXML.

c) Query and Results servers

The QR Server (Query and Results Server) is a permanently-running process that accepts queries from Content Server, passes queries to the fsearch processes, and merges the results when there are multiple fsearch processes running.

The index server can run in continuous mode or in a special mode called suspended mode. In suspended mode, FIXML is generated for any updates to the index but not integrated into the index. When the index server is taken out of suspended mode, the index is updated. Running in suspended mode speeds up the indexing process. Suspended mode should be used when the requirement is to index a large volume of documents or to re-index an entire repository.

4.   Health check up for Index Server processes

Execute the following command through command prompt

nctrl sysstatus

This will list all the Index Server processes with their status.

Another option is to use the Index Server admin console at http://localhost:<port no.>/admin and navigate to the “System Management” tab.

Navigate to the “Matching Engines” tab for details on the total number of documents in all the filestores (if there are multiple repositories) and the number of documents processed by the Index Server. It also provides a link to the Index Server log file.

5.   How to determine Index Server ports

Using the Index Server base port, we can determine the ports for the various Index Server processes:

Index Server admin console: Base Port + 3000

FAST Search console: Base Port + 2100

6.   Fulltext indexing queue messages

When a document has been marked and submitted for fulltext indexing, it is queued to the Index Agent/Index Server.

The full-text index status can be checked with the following DQL query:

select sent_by, date_sent, item_name, content_type, task_state, message from dmi_queue_item where item_id = ”

The task_state can have one of the following values:

‘’ (empty) – The item is available to be picked up by an Index Agent for indexing.

‘acquired’ – The item is being processed. If an Index Agent stops abruptly, a queue item can be left in this state until the Index Agent is restarted or an Administrator clears the queue item.

‘warning’ – The item was indexed with a warning. Often it indicates that the content of the object failed to index but the meta-data was successfully indexed.

The ‘message’ attribute and the Index Agent log will have further details.

‘failed’ – The item failed to index, please refer to the ‘message’ attribute and the Index Agent log for more information.

‘done’ – The item was indexed successfully.
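The task_state values above can be summed up in a small triage sketch; the states and their meanings come from this post, while the helper itself is hypothetical.

```python
# Illustrative sketch mapping dmi_queue_item task_state values to meanings.
TASK_STATES = {
    "": "awaiting pickup by an Index Agent",
    "acquired": "being processed (may be stuck if the agent stopped abruptly)",
    "warning": "indexed with a warning; check 'message' and the agent log",
    "failed": "failed to index; check 'message' and the agent log",
    "done": "successfully indexed",
}

def needs_attention(task_state):
    """Return True for states an administrator should look into."""
    return task_state in ("acquired", "warning", "failed")

print(needs_attention("done"))    # False
print(needs_attention("failed"))  # True
```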

7.   Index Agent Modes

An index agent may run in one of three operational modes:


Normal mode

In normal mode, the index agent processes index queue items and prepares the SysObjects associated with the queue items for indexing. When the index agent successfully submits an object for indexing, it deletes the queue item from the repository. If the object is not submitted successfully, the queue item remains in the repository, and the error or warning generated by the indexing attempt is stored in the queue item.


Migration mode

In migration mode, the index agent processes all SysObjects in a repository sequentially in r_object_id order and prepares them for indexing. A special queue item, the high-water-mark queue item, marks the index agent’s progress through the repository.

An index agent in normal mode and an index agent in migration mode cannot simultaneously update the same index.


File mode

In file mode, a file is used to submit a list of object IDs to the index agent, for example when a new index is created and index verification determines which objects are missing from the index.

8.   Switching modes of Index Agent

At the time of Index Agent set up the wizard gives an option to start the Index Agent under “Normal” or “Migration” mode.

The following steps should be performed to change the Index Agent from one mode to another.

1. Login to Index Agent Admin console through http://localhost:<index agent port no.>/IndexAgent<no.>/login.jsp

e.g., http://localhost:9081/IndexAgent1/login.jsp

2. Stop the Index Agent

3. Now change the Index Agent mode from Normal to Migration or Migration to Normal as the case may be.

4. Click on OK

5. Start the Index Agent again.


i) While in Migration mode, the Index Agent does not appear in DA under the Indexing Management tab. The Index Agent admin screen provides the number of documents processed out of the total number of documents.

ii) If the Index Agent service is restarted from the services console, then start the Index Agent from the Index Agent admin console or through DA under the Indexing Management tab.

9.   Re-configuring Index Agent and FAST

Configuring another IA and FAST instance for a repository previously configured to work with one IA and FAST does not modify the dm_ftengine_config object, and the IA fails to start, displaying an error about connecting to the old FAST machine.

To resolve:

Manually update the dm_ftengine_config object based on the settings from the new machine

1. Go to IAPI and execute the following API –

iapi> retrieve,c,dm_ftengine_config

2. Note the object_id retrieved by the above API and use it to execute the following API –

iapi> dump,c,l

3. In the dump results note the following param_name, param_value pairs

fds_base_port should match 13000 or the base port number for Index Server Install

fds_config_host should match the host name where the Index Server is installed.

and so on….

4. The param_name/param_value pairs should be changed to match the values for the index server install.

5. Delete the following via dql:

delete dm_ftengine_config object where r_object_id = 'old_value'

delete dm_ftindex_agent_config object where r_object_id = 'old_value'

6. Run the index agent configuration program to create new index agent.

10.   Relocating fulltext indexes in Index Server

The following steps describe how to change the location of the full-text indexes:

 1. Shutdown the Index Agent

 2. Shutdown the Index Server

 3. Copy the indexes to the target location ( both the fixml and the index directories)

 4. Default location of the fixml and index directories:

      for Windows – %DOCUMENTUM%/data/fulltext

      for Unix –$DOCUMENTUM/data/fulltext

 5. Edit the following:

 In Windows: %DOCUMENTUM%/fulltext/IndexServer/etc/searchrc-1.xml. Change the “index path”.

 In Unix: $DOCUMENTUM/fulltext/IndexServer/etc/searchrc-1.xml. Change the “index path”

6. Edit the following:

 In Windows: %DOCUMENTUM%/fulltext/IndexServer/etc/config_data/RTSearch/webcluster/rtsearchrc.xml. Change fixmlpath and fsearchdatasetdir to the new path.

 In Unix: $DOCUMENTUM/fulltext/IndexServer/etc/config_data/RTSearch/webcluster/rtsearchrc.xml. Change fixmlpath and fsearchdatasetdir to the new path

 7. Start up the Index Server and the Index Agent.

11.   dm_FTCreateEvents Job

The Create Full-Text Events tool (dm_FTCreateEvents) may be used in two ways:

 a) To complete an upgrade by causing any objects missed by the pre-upgrade indexing operations to be indexed.

 The job generates events for each indexable object added to a repository between the time a new 5.3 or later full-text index is created for a 5.2.5 repository and the time the repository is upgraded to 5.3.

 This is the out-of-the-box behavior of the job.

 b) To generate the events required to re-index an entire 5.3 SP1 or later repository.

 Re-indexing the repository does not require deleting the existing index.

Please refer to the screenshot for the configuration of dm_FTCreateEvents Job –

To generate the events required to re-index an entire 5.3 SPx or later repository, the -full_reindex argument must be set to TRUE.

The first time the job runs in its default mode, the job determines the last object indexed by an index agent running in migration mode and the date on which that object was indexed. The job searches for objects modified after that date and before the job runs for the first time and generates events for those objects. On its subsequent iterations, the job searches for objects modified after the end of the last iteration and before the beginning of the current iteration.
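The incremental windowing described above can be sketched as follows; `window_for_run` is a hypothetical helper for illustration, not the actual job code.

```python
# Illustrative sketch: each run of the job generates events only for objects
# modified inside a time window. The first window opens at the migration-mode
# high-water-mark date; each later window opens where the previous one closed.
from datetime import datetime

def window_for_run(high_water_mark, last_run_end, now):
    """Return the (start, end) modification window for this run."""
    start = last_run_end if last_run_end is not None else high_water_mark
    return (start, now)

# First run: window opens at the high-water-mark date.
first = window_for_run(datetime(2011, 1, 1), None, datetime(2011, 2, 1))
# Second run: window opens where the first one closed.
second = window_for_run(datetime(2011, 1, 1), first[1], datetime(2011, 3, 1))
```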

Before the job is run in a 5.3 SP1 or later repository with argument –full_reindex set to TRUE, you must create a high-water-mark queue item (dmi_queue_item) manually using the API –



and specify the r_object_id of the queue item as the -high_water_mark_id argument of the dm_FTCreateEvents Job.

In case you get the following error message in the job’s report –

FTCreateEvents was aborted. Error happened while processing job. Error: No high water mark found for qualification:

Verify the -high_water_mark_id argument and check whether the API was executed after installation or re-installation of the index server to get the required r_object_id.

Disable the job if the application is not using Full Text Index Server.

The job can also be de-activated if the following events are registered for ‘dm_fulltext_index_user’:

  • dm_save
  • dm_destroy
  • dm_readonlysave
  • dm_checkin
  • dm_move_content

Execute the following query to verify the same:

select event from dmi_registry where user_name='dm_fulltext_index_user'

12.   Using FT Integrity Tool

Modify the parameter file:

a) Log in to the index server host and navigate to the parameter file location.
On Windows: Drive:\Program Files\Documentum\IndexAgents\IndexAgentN\webapps\IndexAgentN

b) Open the ftintegrity.params.txt file in a text editor.

The first line is  -D repositoryname

where repositoryname is the repository for which you created a new index.

c) Add the following two lines immediately after the first line

-U username

-P password

where username is the user name of the Superuser whose account was used to install the index agent and password is the Superuser’s password.

 d) Save the  ftintegrity.params.txt file to %Documentum%\fulltext\IndexServer\bin  (Windows).

Sample FT Integrity params file

Note: The instructions above save a Superuser name and password to the file system in a plain-text parameter file. For security reasons, you may wish to remove that information from the file after running the FT Integrity tool. It is recommended that you save the parameter file in a location accessible only to the repository Superuser and installation owner.

 To run the index verification tool:

  1. Navigate to %Documentum%\fulltext\IndexServer\bin (Windows) or
  2. To verify both completeness and accuracy, open a command prompt and execute

           cobra -i ftintegrity.params.txt -m b

  3. To verify completeness only, open a command prompt and execute

           cobra -i ftintegrity.params.txt -m c

  4. To verify accuracy only and query all indexed objects, open a command prompt and execute

           cobra -i ftintegrity.params.txt -m a

FT Integrity generates three reports:

res-comp-common.txt – object IDs of all documents found in both the index and the repository.

res-comp-dctmonly.txt – object IDs of documents that are in the repository but not indexed.

res-comp-fastonly.txt – object IDs of documents in the index but not in the repository.

It also generates a ftintegrityoutput.txt file, which is simply the console output captured in text format.
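Conceptually, the three reports are a set comparison between the object IDs in the repository and those in the full-text index. A sketch with made-up IDs:

```python
# Illustrative sketch of what the FT Integrity reports represent: a set
# comparison of repository object IDs against indexed object IDs.
def compare(repo_ids, index_ids):
    repo, index = set(repo_ids), set(index_ids)
    return {
        "common": repo & index,    # res-comp-common.txt
        "dctmonly": repo - index,  # res-comp-dctmonly.txt: not yet indexed
        "fastonly": index - repo,  # res-comp-fastonly.txt: stale index entries
    }

# Hypothetical truncated object IDs, for illustration only.
reports = compare(["0900...01", "0900...02"], ["0900...02", "0900...03"])
print(sorted(reports["dctmonly"]))  # ['0900...01']
print(sorted(reports["fastonly"]))  # ['0900...03']
```

The dctmonly report is the interesting one operationally: as described below, it is the list of objects to resubmit for indexing.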

 To resubmit objects that failed indexing:

a) Navigate to %DOCUMENTUM%\fulltext\IndexServer\bin.

b) Copy the res-comp-dctmonly.txt file to

drive:\Program Files\Documentum\IndexAgents\IndexAgentN\webapps\IndexAgentN\WEB-INF\classes.

c) Rename the res-comp-dctmonly.txt file to ids.txt

The index agent periodically checks for the existence of ids.txt. If the file is found, the objects are resubmitted for indexing.

13.   Moving From Index Server 5.3 SPx to D6.5 SPx

 a) Index Server D6.5 can use indexes created by a 5.3 SPx Index Server. This reduces the overhead of re-indexing the entire repository during an upgrade.

 While uninstalling Index Server 5.3 SPx, leave the checkbox for deleting the indexes unchecked. Then, during the D6.5 Index Server installation, point it to the existing indexes folder as the data folder.

b) While upgrading from 5.3 SPx to the D6.5 Index Server, if the indexes from 5.3 are preserved, the data folder should be on the same drive as the Index Server home directory.

c) The path for the attribute-mapping XML is $Documentum\fulltext\fast, as against the $Documentum\fulltext\fast40 folder in 5.3 SPx.

 This path should be correct in dm_ftengine_config object for index based search to function properly.

d) The Index Agent in D6.5 uses 20 consecutive ports, as against 1 port in 5.3 SPx; i.e., if Index Agent 1 is running at port 9081, then Index Agent 2 cannot take any port from 9081 to 9100.

e) D6.5 doesn’t provide an option to switch Index Agent between different modes using Index Agent admin console.

f) The params file for using FT Integrity tool is placed under $Documentum\jboss4.2.0\server\DctmServer_IndexAgentN\deploy\IndexAgentN.war

Accordingly the ids.txt file should be placed under $Documentum\jboss4.2.0\server\DctmServer_IndexAgentN\deploy\IndexAgentN.war\WEB-INF\classes

Hope this post is useful for all those developers who needed more detailed, step-by-step info on Index Servers.

Publish Content to a website using Documentum Web Publisher

September 30, 2011

Today’s global companies produce an enormous amount of content.  Web sites and portals are the first avenue to distribute this business information to all major stakeholders. Web content management has become a primary strategy to help organizations communicate more effectively with their key audiences. Ineffective web content management can significantly undermine company messaging, decrease sales, increase staffing requirements, and raise operational costs and risks.

This post briefly discusses the need for web content management and the solution provided by Documentum. It also describes how to publish content to a website using Documentum Web Publisher.

Documentum Web content management solution

Documentum provides an enterprise content management approach for managing all unstructured data including documents, web pages, XML, and rich media throughout the organization. Documentum web content management system is built on this underlying architecture to support management and publishing of all unstructured content types.  It can drive down costs, simplify the management of multiple sites, and increase productivity for the creation, approval, and publishing of content, ultimately delivering a superior user experience to raise customer satisfaction and revenues.

  Key components:

  • Web Publisher

Web Publisher is a browser-based application that simplifies and automates the creation, review, and publication of web content. It works within Documentum 5, using Documentum Content Server to store and process content, and uses Documentum Site Caching Services (SCS) to publish content to the web. Web Publisher manages web content through its entire life: creation, review, approval, publishing, and archiving. It also includes a full complement of capabilities for global site management, faster web development using templates and presentation files, and administration. Web Publisher can be integrated with authoring tools to develop web sites.

  • Documentum Content Server

Content Server stores content and metadata in a repository called a docbase. It provides a full set of content management services, including library services (check-in and check-out), version control, archiving options, and process management features such as workflows and lifecycles. It also provides secure access to the content stored in the repository.

  • Site Caching Services

Documentum Site Caching Services (SCS) publishes documents directly from a docbase to a web site, extending the capabilities of Content Server. It has two components: source software and target software. The source software is installed on the Content Server host and the target software on the web server. SCS chooses which content to publish, and to what location, according to the parameters in a publishing configuration. Users can create publishing configurations in Documentum Administrator.

  • Site Deployment Services

Site Deployment Services retrieves the web site from the Site Caching Services repository and deploys it to multiple servers or Internet Service Providers.

Related Solutions:

The Documentum web content management solution is strengthened by complementary products that address more sophisticated web challenges such as rich media authoring environments, better searching and navigation, better collaboration, portals, and records management for compliance.

  • Content Intelligence Services

Content Intelligence Services provides better searching, navigation, and personalization for large amounts of web content.

  • Content Rendition Services

Content Rendition Services automates the conversion of standard desktop document formats into Web-ready formats such as PDF and HTML and stores the renditions in a Documentum repository alongside the original.

  • Content Media Services

Content Media Services performs all analysis and transformation activities for any media file format.

  • Web Publisher Portlets

This is the portal solution offered by Documentum. Web Publisher portlets allow users to participate in fundamental, content-based business processes without leaving their familiar portal environments or learning a new application. They include three out-of-the-box portlets that are also fully customizable: My Web Publisher, Submit Content, and Published Content.

  • My Web Publisher – provides a personalized view, allowing users to easily see vital information such as the number of unread tasks and notifications
  • Submit Content – displays all Web Publisher templates that an end user is allowed to access and summarizes files that have been created, published, and checked out
  • Published Content – provides users with a list of documents in an active published state and grouped by category such as announcements, corporate news, or human resources (HR)
  • Record Management Solution – The record management solution helps companies comply with regulations governing electronic information.
  • eRoom – eRoom is a web-based collaboration tool that allows people to work together on content, projects, and processes both within the enterprise and beyond. This may include external entities such as partners, suppliers, customers, and clients.
  • Inter-enterprise workflow services – With inter-enterprise workflow services, Documentum workflows can be extended across a company’s firewall to include business partners. They also enable integration of Documentum workflows with other workflow engines, including EAI and BPM systems, as well as with workflows from other enterprise applications.

Publishing Content to a Website

Traditionally, web teams or IT departments manage web sites manually. Web teams are overwhelmed with demands to constantly publish new content and ensure higher quality standards while managing hundreds of thousands of web pages across external websites and portals. To overcome these challenges, organizations need to empower business content owners to author and publish content. Providing content templates can help maintain brand integrity across all sites. Companies should also ensure that content is reviewed and approved before it is published. With a web content management solution, organizations can achieve consistency and quality without burdening web teams with costly and time-intensive manual updates. We will see how this can be achieved using Documentum as a web content management tool.

Web teams can create web sites using Web Publisher. The web sites created are stored in web cabinets. The web administrator can create groups, which define specific job functions like content authoring and reviewing, and add users to these groups. Web developers can design content templates, assign a lifecycle and workflow to each template, and make it available for use via the web using Web Publisher. A lifecycle identifies the state of a document. The Web Publisher default lifecycle has the following states: Start, WIP, Staging, Approved, and Active.

  • Start

When content is newly created or newly versioned, Web Publisher places it in the Start state, for initialization purposes, and then immediately promotes it to the WIP state.

  • WIP (Work In Progress)

Content in draft or review.

  • Staging

Content that is complete and ready for testing on a staging Web site. By default, Web Publisher does not allow users to modify a file’s content, location or properties if the file has advanced to the Staging state or beyond.

  • Approved

Content that is approved for the active Web site but has not yet reached its publication date (i.e., effective date).

  • Active

Content that is on the active Web site.
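The five states above can be sketched as a tiny state machine. This is an illustrative sketch only: the state names and the "no edits at Staging or beyond" rule come from the descriptions above, while the function names are assumptions for demonstration.

```python
# Illustrative sketch of the default Web Publisher lifecycle described
# above; state names come from the post, the helper functions are
# assumptions for demonstration only.
STATES = ["Start", "WIP", "Staging", "Approved", "Active"]

def promote(state):
    """Promote content to the next lifecycle state."""
    i = STATES.index(state)
    if i == len(STATES) - 1:
        raise ValueError("Active is the final state")
    return STATES[i + 1]

def is_editable(state):
    # By default, content at Staging or beyond can no longer be modified.
    return STATES.index(state) < STATES.index("Staging")

# Newly created content enters Start and is immediately promoted to WIP.
state = promote("Start")
print(state)                    # WIP
print(is_editable(state))       # True
print(is_editable("Approved"))  # False
```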

A workflow defines the activities to be performed on content and the users who will perform them. A workflow can also include automatic tasks, which are performed by the system; for example, an automatic task might promote a file to a new lifecycle state. Using Web Publisher, web teams can create workflow templates, which can later be reused for any content type.

Content authors can create content based on content templates to which a life cycle or workflow is assigned. The content templates help companies to maintain brand integrity. Review and approval of content can be automated using workflow. Thus enterprises can control what content is created, by whom, and in what manner.

Once the content is approved, it has to be published. Content owners can publish content using Web Publisher. For this, a publishing configuration has to be created for the web site using Documentum Administrator. Users can have separate publishing configurations for the different lifecycle stages (WIP, Staging, and Active) for each web cabinet. Each publishing configuration publishes to a separate target location: the WIP and Staging sites are for internal testing; the Active site is the live web site. Users would access the WIP and Staging sites through the Web Publisher preview command or a URL.

If a web site is created in multiple file formats or languages, use publishing configurations to determine which format or language is published to a given web server. For example, suppose product.htm has three renditions: product.htm, product.xml, and product.wml. Users can create two publishing configurations for the site: one that publishes HTML, GIF, and CSS files, and another that publishes WML files (product.xml is used for development and is not published).
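The product.htm example above can be sketched as a simple rendition filter. This is an illustrative sketch under assumed filtering rules; real SCS publishing configurations are defined in Documentum Administrator, not in code like this.

```python
# Illustrative sketch: two publishing configurations that select
# renditions by file extension, following the product.htm example
# above. The filtering rule is an assumption for demonstration.
renditions = ["product.htm", "product.xml", "product.wml",
              "logo.gif", "styles.css"]

def select_for_config(files, allowed_exts):
    """Return the files a publishing configuration would publish."""
    return [f for f in files if f.rsplit(".", 1)[-1] in allowed_exts]

html_site = select_for_config(renditions, {"htm", "html", "gif", "css"})
wml_site = select_for_config(renditions, {"wml"})
print(html_site)  # ['product.htm', 'logo.gif', 'styles.css']
print(wml_site)   # ['product.wml']  (product.xml is never published)
```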

When a publishing configuration is created, SCS automatically creates a publishing job.

An SCS publish operation is initiated when any of the following occurs:

  • When the publishing job’s regular interval occurs.
  • When a user manually publishes content through the Publish command.
  • When content is manually or automatically promoted to Staging or Active state. Promotion initiates the publishing operation only if the web site is configured to use synchronous publishing. Manual promotion occurs when a user either promotes content to the next lifecycle state or power promotes content to the Approved lifecycle state. Automatic promotion occurs when Web Publisher promotes content through an automatic workflow task or through the arrival of the content’s effective date. If a web page reaches the approved state after the effective date is met, the page is published the next time the site is updated.
  • When a user previews content in the WIP or Staging states to see how it will appear on the web. Web Publisher initiates the publishing operation if the content has been modified since the last publishing job ran.
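The four triggers above can be condensed into a small decision function. This is an illustrative sketch only; the event and parameter names are my own and do not correspond to any SCS API.

```python
# Illustrative sketch of the four publish triggers listed above; the
# function and field names are assumptions, not an SCS API.
def should_publish(event, *, synchronous_site=False,
                   modified_since_last_job=False):
    if event == "scheduled_interval":
        return True
    if event == "manual_publish_command":
        return True
    if event == "promoted_to_staging_or_active":
        # Promotion only triggers publishing on synchronous sites.
        return synchronous_site
    if event == "preview":
        # Preview republishes only if content changed since the last job.
        return modified_since_last_job
    return False

print(should_publish("scheduled_interval"))             # True
print(should_publish("promoted_to_staging_or_active"))  # False
print(should_publish("promoted_to_staging_or_active",
                     synchronous_site=True))            # True
print(should_publish("preview",
                     modified_since_last_job=True))     # True
```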

Web Publisher removes web pages from web sites when the pages reach their expiration dates. The steps to create and publish content to a website are given below.

Steps to Create and Publish Content to Website

  1. Log in to Web Publisher as administrator.
  2. Go to Administration -> User Management -> Users.
  3. Add users to the docbase. Refer to Web Publisher help for more details.
  4. Add users to the content author, content manager, and administrator groups.
  5. Assign the content author user the Client Capability of contributor.
  6. Assign the content manager user the Client Capability of coordinator.
  7. Assign the administrator the Client Capability of system administrator.
  8. Create a workflow template using Workflow Manager. Users can use either the desktop version of Workflow Manager or the web version that can be accessed from Web Publisher.

Web Publisher provides default workflow templates, for example Submit to Web site, a simple workflow used to publish content to a web site. When starting this workflow, the content manager specifies an author to work on the content. The task appears in the content author’s inbox. The content author modifies the content and forwards it to the manager for review. Web Publisher promotes the content from WIP to Staging prior to review. If the reviewer rejects the task, Web Publisher demotes it to WIP and routes it back to its originator. If the reviewer approves the task, Web Publisher routes the content to an approver, and the content is promoted to the Approved state. Once content is approved, it is automatically promoted to the Active state and is published to the website.
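The routing described above can be traced with a small sketch. This is an assumed simplification for illustration: the state names and review/approval steps follow the post, but the function and parameter names are my own, not a Web Publisher API.

```python
# Illustrative sketch of the "Submit to Web site" routing described
# above; the state names follow the post, the control flow is an
# assumed simplification.
def submit_to_web_site(review_approved, final_approved):
    """Trace the lifecycle states the content passes through."""
    trail = ["WIP"]              # the author edits the content
    trail.append("Staging")      # promoted prior to review
    if not review_approved:
        trail.append("WIP")      # rejected: demoted and routed back
        return trail
    if final_approved:
        trail.append("Approved") # the approver signs off
        trail.append("Active")   # auto-promoted and published
    return trail

print(submit_to_web_site(review_approved=False, final_approved=False))
# ['WIP', 'Staging', 'WIP']
print(submit_to_web_site(review_approved=True, final_approved=True))
# ['WIP', 'Staging', 'Approved', 'Active']
```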

To create a workflow template using the desktop version of Workflow Manager:

a) Open Workflow Manager.

b) Log in to Workflow Manager as a Web Publisher administrator. A Web Publisher administrator should have superuser permissions.

c) Choose File->Open. Browse to the System->Applications->Web Publisher folder and select a Web Publisher default workflow, for example Submit to Web site. This opens a default Web Publisher workflow on which to base the custom workflow.

d) Choose File->Save As and save the workflow with a name that represents the workflow. All workflows must be saved in System->Applications->WebPublisher-><user_defined_workflow_folder>. Create a new folder or save workflows to the Web Publisher root folder.

e) Validate the template.

f) Install the new workflow template.

g) Make it available through Web Publisher.

For more information on creating workflows, refer to Workflow Manager User Guide.

To access Workflow Manager from Web Publisher, log in to Web Publisher as a Web Publisher administrator, select Web Publisher Admin->Workflow templates, and repeat steps c through g.

  9. Create a new category under Templates in Web Publisher.
  10. Import a template to the newly created category. The template provides the layout for content.
  11. Assign the default lifecycle and the newly created workflow to the template.
  12. Make the template available for use.
  13. Create a web cabinet in Web Publisher.
  14. Create a folder in the web cabinet.
  15. Create content using the newly created template in Web Publisher.
  16. Create a site publishing configuration in Documentum Administrator:

    i) Start Documentum Administrator and connect to the docbase as a superuser.

    ii) Click Site Publishing.

    iii) Create a new site-publishing configuration.

    iv) Set values in the site-publishing configuration:

      a) Click Active.

      b) In the Version field, type Active.

      c) Click Publishing Folder and browse the docbase to the website folder (web cabinet).

      d) Type the target host name. This is the host where the SCS target software is installed.

      e) Type the target port for making connections to the target host. The port entered must match the port specified at the time of SCS installation (DefaultPort, 2789).

      f) Type the target root directory to which the content should be published. To publish web pages to Apache Tomcat, give this as the target directory: C:\Program Files\Apache Group\Tomcat 4.1\webapps\ROOT

      g) Choose the connection type.

      h) Click the Advanced tab.

      i) Select a new export directory or leave the default unchanged.

      j) Type the transfer user name, password, and domain. Enter the transfer authentication domain name that was provided during the SCS target installation, along with a valid username and password.

      k) Click OK.

  17. Log in to Web Publisher as content manager.

  18. Start the new workflow using Web Publisher:

      a. Navigate to the content created.

      b. Select the checkbox.

      c. Go to Tools.

      d. Select Workflow -> Start.

      e. Assign a content author and web admin.

  19. Workflow tasks will appear in the users’ inboxes.

  20. A user can accept and forward the task to the next user, or reject it.

  21. Content is automatically published to the web site once it is approved.

To conclude, Documentum provides an enterprise approach to transform online presence and drive ROI. It provides an easy-to-use, browser-based interface that empowers non-technical users to easily create, manage, and publish content for multilingual web sites and portals. This solution is suitable for medium to large companies.
