
Relationship and Virtual Documents in Documentum

November 26, 2011

This post explains how relationships and virtual documents can be used to relate objects in Documentum using the Documentum Foundation Classes (DFC).

Definitions:

  1. Relationship: A relationship implies a connection between two objects. When an object is related to another, we can define which object is the parent and which the child, or whether they are equal. Relationships can be system-defined as well as user-defined. In this BOK, we confine ourselves to user-defined relationships.
  2. Virtual Document: In Documentum, a document which holds other documents, i.e. one which acts as a container, is called a virtual document. A virtual document can contain other virtual documents. The document which acts as the container is called the parent document, while the documents contained in it are called child documents.

Overview on relationship:

Two built-in object types, dm_relation and dm_relation_type, are used to create relations between any two objects. The dm_relation_type object defines the behavior of the relation. The dm_relation object identifies the dm_relation_type object and the two objects between which the relation needs to be created. Pictorially it can be shown as:

Figure-1

The dm_relation_type object has following attributes:

  1. child_parent_label: A label describing the relation in the child-to-parent direction.
  2. parent_child_label: A label describing the relation in the parent-to-child direction.
  3. description: A general description of the relation.
  4. parent_type: The type of objects that can act as parents.
  5. child_type: The type of objects that can act as children.
  6. direction_kind: It defines the nature of relationships between the objects. The expected values are:
  6. direction_kind: It defines the nature of relationships between the objects. The expected values are:

a)    1 – Parent to Child

b)    2 – Child to Parent

c)    3 – Objects are at equal level

7. integrity_kind: It specifies the type of referential integrity used when either of the two related objects has to be deleted. The expected values are:

a)    0 – Any of the two related objects can be deleted.

b)    1 – As long as the relation exists, neither of the related objects can be deleted.

c)    2 – If one of the related objects gets deleted, other one also gets deleted.

  8. relation_name: Specifies a name for the relation.
  9. security_type: Indicates the type of security to be used for the relation object. The valid values are:

a)    SYSTEM: If this value is used, then super-user privileges are required for creating, deleting or modifying the relationships pertaining to this dm_relation_type object.

b)    PARENT: In this case, the ACL for the relation is inherited from the parent object in the relation and RELATE permission is required to create, modify, or drop the relation. The exception to this is if the parent object is not a subtype of dm_sysobject, then no security will be enforced.

c)    CHILD: In this case, the ACL for the relation is inherited from the child object in the relation and RELATE permission is required to create, modify, or drop the relation. The exception to this is if the child object is not a subtype of dm_sysobject, then no security will be enforced.

d)    NONE – In this case, no security is applied. All users can create, modify or delete this kind of relationship.

The dm_relation object has following attributes:

  1. child_id: The r_object_id or i_chronicle_id of the child object in the relation. If i_chronicle_id is used, the child_label attribute can be used to bind the parent object to a particular version of the child.
  2. parent_id: The r_object_id or i_chronicle_id of the parent object in the relation. If the permanent_link attribute is set to TRUE, the i_chronicle_id must be used.
  3. permanent_link: If every new version of the parent object has to be related to the child object, this attribute must be set to TRUE and the i_chronicle_id must be used in the parent_id attribute. The default value is FALSE.
  4. relation_name: It specifies the value of relation_name attribute of the dm_relation_type object that defines the type of relationship.
  5. child_label <Optional>: If i_chronicle_id is used in the attribute ‘child_id’, then the label of the version of the child object is to be specified here.
  6. description <Optional>: Specifies the description.
  7. effective_date<Optional>: Not used by the system, a user-defined date. Custom logic could check this date to determine the state of the relationship.
  8. expiration_date<Optional>: Not used by the system, a user-defined date. Custom logic could check this date to determine the state of the relationship.
  9. order_no<Optional>: Not used by the system. Custom logic could use this integer value to order a set of relationships.
  • Creation of dm_relation_type object using DQL: The query used to create a dm_relation_type object is as follows:

create dm_relation_type object
set child_parent_label = '<Child to parent label>',
set parent_child_label = '<Parent to child label>',
set description = '<Description>',
set parent_type = '<Document type>',
set child_type = '<Document type>',
set direction_kind = <1 or 2 or 3>,
set integrity_kind = <0 or 1 or 2>,
set relation_name = '<Name of Relation>',
set security_type = '<SYSTEM or PARENT or CHILD or NONE>'
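As a concrete illustration (the labels, type names and relation name below are hypothetical, not from the original project), a relation type linking contract documents could be created as:

```sql
create dm_relation_type object
set child_parent_label = 'is a child contract of',
set parent_child_label = 'is a parent contract of',
set description = 'Links related contract documents',
set parent_type = 'dm_document',
set child_type = 'dm_document',
set direction_kind = 2,
set integrity_kind = 1,
set relation_name = 'contract_relation',
set security_type = 'PARENT'
```

Here integrity_kind = 1 means neither related object can be deleted while the relation exists.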

  • Creation of dm_relation object using DFC: The following methods, each of which returns the new dm_relation object, can be used to create a dm_relation object.

1. addChildRelative(relationTypeName, childId, childLabel, isPermanent, description): This method has to be invoked on the object which is going to act as Parent in the relation. The parameters it takes are

a)    relationTypeName – Name of a valid dm_relation_type object.

b)    childId – The r_object_id or i_chronicle_id of the child object of the relation.

c)    childLabel – Version label of the child object. If this is ‘null’, the relation will contain no child label.

d)    isPermanent – Specifies whether the link is permanent. Valid values are TRUE and FALSE.

e)    description – Specifies the description for the relation object. If ‘null’, the relation object will not have a description.

2. addParentRelative(relationTypeName, parentId, childLabel, isPermanent, description): This method has to be invoked on the object which is going to act as the child in the relation. It takes the same parameters as addChildRelative, except that instead of the r_object_id or i_chronicle_id of the child object, we pass the r_object_id or i_chronicle_id of the parent object as the parameter parentId.
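A minimal DFC sketch of the first method (the idfSession variable, document names and relation name are assumptions; this needs a live repository session and the DFC jars, so it is illustrative rather than runnable on its own):

```java
// Relate document B (child) to document A (parent) via a user-defined relation.
IDfSysObject parentDoc = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'A'");
IDfSysObject childDoc = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'B'");

// Invoked on the parent; passing the chronicle id lets the relation follow versions.
IDfRelation rel = parentDoc.addChildRelative(
        "contract_relation",           // relationTypeName (assumed to exist)
        childDoc.getChronicleId(),     // childId
        "CURRENT",                     // childLabel
        true,                          // isPermanent
        "relation between A and B");   // description
```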

Note: dm_relation object can be created through DQL also.
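For reference, the equivalent DQL sketch (the relation name is hypothetical and the IDs are placeholders):

```sql
create dm_relation object
set relation_name = 'contract_relation',
set parent_id = '<i_chronicle_id of parent>',
set child_id = '<i_chronicle_id of child>',
set child_label = 'CURRENT',
set permanent_link = TRUE
```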

Overview on virtual document:

A virtual document provides a way of combining documents in various formats into one consolidated document. For example, a Word document, a PDF document and an image can be combined to form one virtual document. There is no limit on the nesting of documents. Either one particular version or all versions of a component can be combined with a virtual document. Two object types are used to store information about virtual documents. They are:

  1. Containment Object Type: It stores the information that links a component to a virtual document. Every time a component is added to a virtual document, a containment object is created for that component. The attributes of this object type can be set by the methods AppendPart, InsertPart, UpdatePart.
  2. Assembly Object Type: An assembly object provides a snapshot of a virtual document at a given instant.

Creation of a virtual document using DFC:

A document can be converted to a virtual document by invoking the method setIsVirtualDocument on it. This method sets the r_is_virtual_doc attribute of the document.
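A short DFC sketch (assuming an open session variable named idfSession, matching the style of the other snippets in this blog):

```java
// Convert an ordinary document into a virtual document.
IDfSysObject doc = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'A'");
doc.setIsVirtualDocument(true);  // sets r_is_virtual_doc
doc.save();
```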

Note: Virtual documents can also be created using clients such as Webtop and DA, as well as through DQL.

Requirement in the project:

Consider three main documents A, B and C; essentially, they represent different contracts. Consider another set of documents A1, A2, A3, B1, B2, B3, C1, C2 and C3. A1, A2 and A3 are directly related to A; B1, B2 and B3 are directly related to B; C1, C2 and C3 are directly related to C. Also, A is related to B (A is child, B is parent), B is related to C (B is child, C is parent), and C is related to A (C is child, A is parent). The documents referred to here are Documentum documents of a certain system-defined or user-defined document type.

As per the requirements:

  1. For every new version of the documents the existing relations should be valid.
  2. From the document A, we should be able to navigate to A1, A2 and A3 and also to the documents B and C. Similarly for B and C.
  3. Depending on a particular attribute of main documents (A, B and C), there should be dynamic creation or deletion of relationships between contracts.

Resolution of requirement # 1:

Issue encountered:

The documentation says that when a dm_relation object is created with permanent_link = TRUE and child_label = 'Current', a new instance of the dm_relation object is created for every new version of the parent or child, relating the latest versions of the two objects. On implementing this, however, the latest version of the parent object was always related to the child object with which the relation was initially created.

Issue resolution:

To maintain the relation across the current versions of the documents, and to allow easy navigation from parent to child documents, the concept of virtual documents was used in addition to relationships.

All the main documents A, B and C were converted to virtual documents. The child documents A1, A2 and A3 were added as children to the newly converted virtual document A. For this the following DFC methods were used in the same order as specified:

  1. asVirtualDocument(lateBindingValue, followRootAssembly): This method is invoked on a virtual document (in this case on A, B and C) and returns the virtual document representation of the object on which it is invoked. The parameters it takes are:

a)    lateBindingValue – the version label of the virtual document. To meet our requirement the value should be "Current".

b)    followRootAssembly – if set to TRUE, the assembly specified by the root node is used as the virtual document.

  2. getRootNode(): This method is invoked on the virtual document representation of a virtual document. It returns the root node of the virtual document, which is essentially the virtual document at the top of the virtual document tree. (In our case A, B and C are root nodes.)

  3. addNode(parentNode, insertAfterNode, objectChronId, binding, followAssembly, overrideLateBindingValue): This method is invoked on the virtual document representation and adds a new node to it. The parameters it takes are:

a)    parentNode – the root node.

b)    insertAfterNode – a virtual document node that will immediately precede the new node in the virtual document's hierarchy. If this parameter is null, the new node is placed as the first child of parentNode.

c)    objectChronId – the i_chronicle_id of the document which is to be added as a child of the virtual document (in our case, the i_chronicle_id of A1, A2, A3, B1, B2, B3, C1, C2 and C3).

d)    binding – the version label of the child document's version with which the child document is bound to the virtual document.

e)    followAssembly – set to TRUE if the follow_assembly attribute has to be set to TRUE for the component.

f)    overrideLateBindingValue – set to TRUE if the version label identified in binding is to be used to resolve late-bound descendants of this component.
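The steps above can be sketched in DFC as follows (document names and the idfSession variable are assumptions; checkout/locking concerns are omitted for brevity):

```java
// Add child document A1 under virtual document A.
IDfSysObject a  = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'A'");
IDfSysObject a1 = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'A1'");

// 1. Get the virtual document representation, bound to the CURRENT label.
IDfVirtualDocument vdoc = a.asVirtualDocument("CURRENT", false);

// 2. Get the root node (A itself).
IDfVirtualDocumentNode root = vdoc.getRootNode();

// 3. Add A1 as the first child of the root, bound to its CURRENT version.
vdoc.addNode(root, null, a1.getChronicleId(), "CURRENT", false, false);
a.save();  // persist the containment change
```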

Using virtual documents, the current versions of the child documents are always present in the parent virtual document. Thus the current versions of A and A1, A2, A3 are always related, and since A has been converted to a virtual document, we can navigate to A1, A2 and A3 simply by clicking on A.

Resolution of requirement # 2:

Now, using relationships, a relation was created between each pair: A and B, B and C, and A and C. Thus navigation across the different contracts A, B and C became possible. Pictorially this can be shown as:

Figure-2

In the above figure the arrow denotes the relationship between two documents.

Resolution of requirement #3:

Requirement #3 states that there should be dynamic creation or deletion of relations between the main documents depending on a particular attribute, say attr. The value of attr for, say, document A determines to which document A will act as a child. If the value of attr for document A changes so as to imply that A and B are no longer related, the relation object between A and B should be destroyed. If the new value points to a new document D, a relation has to be created between A and D.

So the change in value of that particular attribute needs to be intercepted. The interception can be done as follows:

  1. Write a TBO (Type Based Object) for the document type to which A belongs.

Note: For details on TBOs, refer to BusinessObjectsDevelopersGuide.pdf provided by Documentum.

  2. In the TBO, override the method setString(attribute name, value of attribute) if the attribute attr is a single-valued attribute, or appendString(attribute name, value of attribute) if it is a multi-valued attribute. These two methods capture the attributes and their values for a document type: setString captures the single-valued attributes while appendString captures the multi-valued ones.
  3. In either of these methods, compare the old value of attribute attr with the new one. If there is a change, destroy all the existing relations in which document A is the child. This can be done using the DFC method removeParentRelative(relationTypeName, parentId, childLabel), invoked on A. The parameters it takes are:
    1. relationTypeName – name of the relation object.
    2. parentId – r_object_id of the parent object.
    3. childLabel – version label of the child object.
  Then use the method addParentRelative, as explained earlier, to relate document D as parent to document A.
  4. Every time the value of attribute attr is changed and the document is saved, the corresponding TBO is invoked and the above methods are executed. Thus dynamic creation and deletion of relations is achieved.
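The interception described above can be sketched as a TBO override (the class name, the attribute name attr, the relation name and the findParentFor helper are all hypothetical; the four mandatory business-object methods and error handling are omitted):

```java
// Hypothetical TBO for A's document type: watch attribute "attr" and
// re-point the parent relation when its value changes.
public class ContractTBO extends DfDocument {
    @Override
    public void setString(String attrName, String value) throws DfException {
        if ("attr".equals(attrName) && !value.equals(getString("attr"))) {
            // Resolve old and new parents from the attribute value
            // (findParentFor is an application-specific, hypothetical helper).
            IDfId oldParentId = findParentFor(getString("attr"));
            IDfId newParentId = findParentFor(value);
            // Drop the relation to the old parent, then link to the new one.
            removeParentRelative("contract_relation", oldParentId, null);
            addParentRelative("contract_relation", newParentId, "CURRENT",
                    true, "re-linked by TBO");
        }
        super.setString(attrName, value);
    }
}
```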

Conclusion: Thus relationships and virtual documents can be used together to relate objects in Documentum using the Documentum Foundation Classes (DFC).

Installing Documentum TBO using Composer

May 12, 2011

Introduction:

The Business Object Framework (BOF) is a set of functionality that provides the ability to hook into any of the methods in the standard DFC object interfaces.

Business objects are of two kinds:

  1. Type Based Object (TBO)
  2. Service Based Object (SBO)

TBOs are used when only one object type needs its functionality overridden, whereas an SBO is used when the functionality is overridden for several object types or when we want to reuse it in many TBOs.

The main use of a TBO is the ability it gives a developer to override selected functionality of DfSysObject or any of its subtypes.

Module 1: Creating the TBO

Steps to be followed for creating a TBO jar

  1. Create a Java class with the same name as the object type.
  2. The class must extend DfDocument and override four methods: getVersion(), getVendorString(), isCompatible(String s) and supportsFeature(String s).
  3. It must then override the method the TBO is needed for, e.g. doSave().
  4. Compile the class and package it into a jar.
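A skeleton matching the steps above might look like this (the class name and version string are placeholders; consult the Business Objects Developer's Guide for the exact contract in your DFC version):

```java
// TBO implementation; the class is named after the object type it backs.
public class CustomDoc extends DfDocument {

    // The four methods every TBO must provide:
    public String getVersion()               { return "1.0"; }
    public String getVendorString()          { return "Example Vendor"; }
    public boolean isCompatible(String v)    { return "1.0".equals(v); }
    public boolean supportsFeature(String f) { return false; }

    // Override the operation the TBO exists for, e.g. doSave().
    @Override
    protected synchronized void doSave(boolean saveLock, String versionLabel,
            Object[] extendedArgs) throws DfException {
        // custom pre-save logic goes here
        super.doSave(saveLock, versionLabel, extendedArgs);
    }
}
```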

Module 2: Installing the TBO

Steps for installing the TBO in the docbase

  1. Create a new Documentum project in Composer.
  2. Right-click on Artifacts and select New >> Other. In the New screen select Documentum Artifact >> Jar Definition. Click Next, enter the name of the object type in the Artifact name field and click Finish.
  3. Configure the new jardef: since we have created only an implementation of the TBO, set the Type field to Implementation, and in the jar content browse to the jar file created in Module 1.
  4. Right-click on Artifacts and select New >> Other. In the New screen select Documentum Artifact >> Module. Click Next, enter the name of the object type in the Artifact name field and click Finish.
  5. Configure the new module: in the Info tab, select TBO from the Type drop-down menu.
  6. In the Core JARs tab, click Add under Implementation JARs and select the jardef created in the previous step. Click Select next to Class Name; this displays the classes in the jardef. Select the appropriate class.
  7. Right-click on the project and select the "Install Documentum Project" menu item. Enter the username and password and click the Login button. Once the user is authenticated, the Next and Finish buttons become active. Click Finish to install the TBO into the docbase. This may take some time.

Now whenever an object of the type for which the TBO was created is invoked or created, the TBO will be fired. The TBO can be found at Cabinets >> System >> Modules >> TBO.

 Module 3: Re-Installing the TBO

In case the Java class has been modified, the TBO needs to be re-installed. The steps to re-install the TBO are as follows:

  1. Delete the folder with the TBO name in the location (Cabinets/System/Modules/TBO)
  2. Recreate the jar using the new java file.
  3. In Composer remove the jar file from the jardef and add the new jar file
  4. Reinstall the Project to the docbase.

Documentum Traces

November 4, 2010
Tracing is one of the easiest and most effective ways to troubleshoot complex issues in Documentum.
There are five types of traces I can think of:
  • DMCL Trace
  • SQL Trace
  • Method Server Trace
  • DFC Trace
  • Authentication Trace

1. DFC trace

Description:

This is to trace all the requests that use the Documentum Foundation Classes.

Steps:

To do this you need to add the following flags to the %DOCUMENTUM%\config\dfc.properties file:

#
# Specifies whether to combine the trace of dmcl along with other traces.

#
dfc.tracing.combineDMCL=true

#
# Specifies whether to enable or disable trace.
#
dfc.tracing.enabled=true

Once you save the file, the tracing starts.
You do not need to restart the application server, but tracing can create extremely large logs. For this reason it is best to enable tracing just before you are ready to replicate the problem and disable it right after you finish replicating it.
To stop the tracing change the flag value to false:

#
# Specifies whether to enable or disable trace.
#
dfc.tracing.enabled=false

Remember you need to save the file for the change to take effect.

The information is saved in the %DOCUMENTUM%\logs\trace.log file unless you have modified the default configuration in the %DOCUMENTUM%\config\log4j.properties file. If you are not sure, check this flag:

log4j.appender.FILE_TRACE.File=C\:/Documentum/logs/trace.log

2. DMCL Trace

Description:

This is to trace all the requests that go from the Documentum clients to the content server.

Steps:

Go to the dmcl.ini file and add the following two lines:

trace_file=<specify the location where you want the file to be created>

trace_level=<1-10>
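For example, a hypothetical dmcl.ini fragment might look like this (the path is illustrative; level 10 is the most verbose):

```ini
[DMAPI_CONFIGURATION]
trace_file = C:\temp\dmcl_trace.log
trace_level = 10
```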

3. SQL trace

Description:

This is to trace all the requests that go from the Content Server to the database; the trace is logged automatically to the docbase log.

Steps:

This is done on the Content Server box.

Go to the Documentum Server Manager and select the docbase on which you want to enable the trace.

Go to Edit Service and add the following line at the end:

-osqltrace

The trace is written to the docbase logs, which can be found in

DM_HOME/dba/logs

 

4. Method server trace

Description:

This is to trace all the requests that go to the Method server.

This is done on the Content Server box.

Go to the Documentum Server Manager and select the docbase on which you want to enable the trace.

Go to Edit Service and add the following line at the end:

-otrace_method_server

Logs are in %DOCUMENTUM%\dba\log\<repository_id>\MethodServer\MethodServer\server_config_name.log

 

5. Authentication Trace

Description:

This is to trace all the authentication mechanisms in a docbase.

This is done on the Content Server box.

Go to the Documentum Server Manager and select the docbase on which you want to enable the trace.

Go to Edit Service and add the following line at the end:

-oauthentication_trace

The trace is written to the docbase logs, which can be found in

DM_HOME/dba/logs

Appreciate any comments.

Integrated Document Management (IDM) environment

November 2, 2010

As posted in my previous blog, document management is a system for controlling the capture, cataloguing, storage, retrieval, revision, sharing and reuse, protection and archiving of documents. The document flow in an IDM environment can be depicted as shown below:

The above components can be classified into the following three key features of IDM:

Document Capturing:
o    Scanning for hard copies – involves a physical scanner and scanning software.
o    Optical Character Recognition (OCR) capability – involves special scanning software and hardware.
o    Image enhancement – part of the scanning software.
o    Indexing utility – captures the metadata for the images. The indexing utility could be part of the scanning software, or it could be a separate system capable of capturing and attaching metadata to the images. Bar-coding technology has emerged as one of the most popular approaches for representing the index data of documents to be scanned.
o    Electronic document indexing – documents which are already in electronic format, such as e-mail and e-fax (like RightFax), need to be indexed prior to storing them in the document repository. Some new interfaces may need to be built to support this feature (e.g., a user interface for a RightFax server to index a fax).
o    File format converters – if all the scanned/electronic documents need to be stored in a predefined file format, a file format converter may be needed prior to exporting a document to the document repository.

Document Access and Management:

o    Document repository management – controls the organization of documents within a document repository. This provides features like organizing documents within different folders and subfolders, and version management of documents. These are basic features of any document management system, such as Documentum, FileNet, Tower Software etc.
o    Security or access control – most document management systems provide good security features for the stored documents. Multiple levels of security can be provided to different user groups. Additional security can be built within the systems accessing the documents from the document repository.
o    Reporting – most document management systems provide a facility to connect to an external reporting tool or may internally have a reporting feature. Reporting is primarily done on the document attributes (metadata).
o    Document access interface – document management systems usually provide interfaces or APIs through which documents and document attributes can be retrieved and accessed by the external world. For example, Documentum exposes its document repository to external systems through the Documentum Foundation Classes (DFC).
o    Archiving & records management – though records management has developed into a solution space of its own, basic archival and records retention facilities are provided by most IDM vendors.

Document Workflow management:

Process automation is the main objective of the Workflow tools. When the business process involves handling of documents, document workflow comes into picture.

o    Case Management – the documents are always associated with a case. For example, a "case" could be a person, a policy or any other entity.
o    Process Modelling – involves defining the various processes and decision rules involved in the workflow of the document. For example, an insurance application document needs to be reviewed by the application processing staff before passing to the underwriter's review. In this case, Application Processing and Underwriting are the processes involved in the workflow.
o    Routing – based on the decision rules, documents are routed to different processes or individuals.

 

Hope this is helpful; more about IDM can be found by googling.

Enabling Logging in DFC Applications

November 1, 2010

Log4j is a Java-based logging utility primarily used as a debugging tool. It is an open-source project of the Apache Software Foundation that provides a reliable, fast and extensible logging library for Java.

DfLogger is a Documentum DFC class (available with version 5.1 and higher) that can be used to enable logging from DFC applications. This post describes how to configure and use the logging library from DFC.

Enabling Logging:
Let us have a look at how to enable logging for a java application.
Open the $DOCUMENTUM/config/log4j.properties file (created during the DFC install itself), enter the following lines at the end of the file and save it.
# Enable log messages for a custom class
log4j.logger.com.documentum.custom.webui=DEBUG,WebUI
#WebUI Appender
log4j.appender.WebUI=org.apache.log4j.RollingFileAppender
log4j.appender.WebUI.MaxFileSize=10MB
log4j.appender.WebUI.File=C\:/WebUI.log
log4j.appender.WebUI.layout=org.apache.log4j.PatternLayout
log4j.appender.WebUI.layout.ConversionPattern=%d{HH:mm:ss} %p %c %m %n

Let us have a look at each of the lines that we have appended to the properties file.
Line 1 : log4j.logger.com.documentum.custom.webui=DEBUG,WebUI
This line specifies the following:
a) the package/class being logged,
b) the priority level – 5 levels of priority are defined in log4j (DEBUG,INFO,WARN,ERROR & FATAL). In our example, we have set it to DEBUG resulting in all messages being logged.
c) the name of the Appender. Log4j allows logging requests to print to multiple destinations. In log4j jargon, an output destination is called an appender. Currently, appenders exist for the console, files, GUI components, remote socket servers, JMS, NT Event Loggers, and remote UNIX Syslog daemons. It is also possible to log asynchronously. We have named our appender WebUI. Lines 2-6 are for configuring the appender.
Line 2 : log4j.appender.WebUI=org.apache.log4j.RollingFileAppender
This line specifies the type of appender to use. RollingFileAppender backs up log files once they reach a maximum size (specified by the next line).

Line 3 : log4j.appender.WebUI.MaxFileSize=10MB
This line specifies the maximum size of the log file. Once the file reaches this size, the appender backs it up and creates a new log file.
Line 4 : log4j.appender.WebUI.File=C\:/WebUI.log
This line specifies the name of the log file to be created as well as its location. In case the location is not specified, then the log file gets created in the location of the executing class.
Line 5 : log4j.appender.WebUI.layout=org.apache.log4j.PatternLayout
This line specifies that a pattern layout is being used
Line 6 : log4j.appender.WebUI.layout.ConversionPattern=%d{HH:mm:ss} %p %c %m %n
This line specifies the interpretation of the pattern layout. The conversion pattern used here is as follows :
%d{ HH:mm:ss}–date and time of logging, %p- priority level of the message, %c- name of the category (i.e the class/package name),%m- the message, %n- newline character.
Others that could be used are: %t – name of current thread, %F – java source filename, %C –java class name, %M – java method name, %L – java source line number

Now, executing any of the classes in the webui package will result in the creation of a file called WebUI.log in the specified location (see the entry 'log4j.appender.WebUI.File'). This is the file into which all the log statements will be written. But log statements cannot come out of thin air; we need to insert calls to DfLogger in our DFC code.
Using DfLogger
To log a message in the log file, we need to insert a call to the DfLogger class in our DFC code. The signature of the method is as follows:
DfLogger.debug(Object arg0, String arg1, String[] arg2, Throwable arg3)
Object arg0 : specifies the class for which the debug message is being logged. Usually, the DfLogger class is called from the class for which we log the message and hence we use the ‘this’ keyword. It refers to the current class instance.
String arg1 : the message to log
String[] arg2 : parameters to use when formatting message
Throwable arg3 : a throwable to log ,can be used to print a stack trace to the log file in case of an exception.
Ex: DfLogger.debug(this, "The Debug Message", null, dfe);
If we need to print the stack trace of any exception in the class, we need to pass the instance of the Throwable class as the fourth argument.

Ex: catch (DfException dfe) {
DfLogger.debug(this, "The Debug Message", null, dfe);
}
The Outcome
Say, these are DfLogger class calls in our custom code :

DfLogger.error, DfLogger.fatal, DfLogger.info, DfLogger.warn are the method calls to log messages in that priority.
Then, when the user clicks on the link called search, the message gets logged in the log file as follows :

Note the conversion pattern. Also, since we set the priority level to DEBUG in the properties file, all the log messages are printed, not just the DEBUG ones. Had we given FATAL (the highest level), only the message for the DfLogger.fatal(..) call would have been logged.

Just a small configuration change, and this can go a long way in helping a developer trace the root cause of an issue, understand the application, track the control flow in the application and so on. But one thing needs to be taken care of: increasing the number of log messages in an application can bring down its performance.

DBOF Configurations

May 5, 2010

We know the DBOF lives wherever DFC lives: on the Content Server, on the application server, or on any of the other clients. The following steps need to be followed when configuring the DBOF:

1) Configure dmcl.ini to both access the docbase and enable session pooling. Add the following entry to the [DMAPI_CONFIGURATION] section:

connect_pooling_enabled = T

When installing Documentum, a file with the name DMCL.ini is created in the folder

‘:/SYSROOT\system32’.

Example: C:/SYSROOT/system32.

2) Make sure dfc.jar is referenced by the CLASSPATH environment variable of the JVM running the application; dfc.jar is in the 'Shared' folder under the Documentum installation folder.

Example: C:/Program Files/Documentum/Shared

3) Make sure dmcl40.dll is in a directory referenced by the PATH variable. By default the file is placed in folder C:/Program Files/Documentum/Shared.

4) The DFC_DATA system variable must refer to the directory containing Documentum configuration information, usually C:\Documentum. This needs to be a directory that has an immediate subdirectory called C:\Documentum\config. This is where the dbor.properties file is also located.

5) Make sure TBO classes and SBO classes are configured in dbor.properties file on
Application Server.

6) The dbor.properties file is editable. As a best practice, it should be edited by a program using the IDfDbor and IDfDborEntry interfaces provided by DFC. The entries in the dbor.properties file look like:

doc_request=type, com.bp.customObj.DocumentRequestTBO, 1.0
com.bp.customObj.ICreateFolderSBO=service,com.bp.customObj.CreateFolderSBO,1.0

DocumentRequestTBO, ICreateFolderSBO and CreateFolderSBO are the Java classes created to provide our own business logic. doc_request is the custom object type defined in the docbase; one can create custom object types using Documentum DA. 'type' indicates a Type Based Object, followed by the full class name; 1.0 is the version number of the class to be used.

The second line represents the mapping of the SBO, as the identifier 'service' indicates.

7) Make sure TBO and SBO compiled classes are accessible from class path variable on
Application server.

I will try to bring in how to implement the concept of BOF in Documentum in my coming Posts.

Appreciate your comments.


Handling LifeCycles in DCTM using DFC

April 27, 2010

We know how Documentum helps attach a lifecycle to a document and how to handle its states, viz. how to promote and demote a lifecycle state. Let's try to do the same using DFC.

Attach a LifeCycle to a document in docbase:

public void attachLifeCycle() throws Exception {
    IDfSysObject sysObj = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'UniqueDocName'");
    IDfSysObject procObj = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_policy where object_name = 'LifeCycle Name'");
    sysObj.attachPolicy(procObj.getObjectId(), "Author", "");
}

Promote a LifeCycle state of a document inside a docbase:

public void promoteLifeCycle(String state) throws Exception {
    IDfSysObject sysObj = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'UniqueDocName'");
    sysObj.promote(state, false, false);
}

Demote a LifeCycle state of a document inside a docbase:

public void demoteLifeCycle(String state) throws Exception {
    IDfSysObject sysObj = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'UniqueDocName'");
    sysObj.demote(state, false);
}

How do we suspend or resume the lifecycle of a document in Documentum?

Expire a LifeCycle state of a document in a docbase:

public void expireLifeCycle(String state) throws Exception {
    IDfSysObject sysObj = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'UniqueDocName'");
    sysObj.suspend(state, false, false);
}

Resume a LifeCycle state of a document inside docbase:

public void resumeLifeCycle(String state) throws Exception {
    IDfSysObject sysObj = (IDfSysObject) idfSession.getObjectByQualification(
        "dm_document where object_name = 'UniqueDocName'");
    sysObj.resume(state, false, false, false);
}

Your valuable comments matter; I will tune the above snippets if any adjustment is required.

Thank You
