
LATEST ARTICLES

Hello,

Following the posts Documentum : Moving Content Files via Administration Method MIGRATE_CONTENT (1/3) and Documentum : Moving Content Files via Records Migration Job (2/3), which covered content migration via the MIGRATE_CONTENT administration method and via the Records Migration Job respectively, I propose to focus on content migration via a Migration Policy.
 
As a reminder, Content Server supports three ways to move content files:

  • 1. MIGRATE_CONTENT administration method
  • 2. Records migration job
  • 3. Migration Policy (requires a Documentum Content Storage Services license)

 
 
 

3. MIGRATION POLICY (JOB)
 

Content migration policies require a Content Storage Services license and are implemented as jobs. These jobs automate the movement of content files from one storage area to another; when defining a policy, you:

  • specify the target storage area that will store the selected content files,
  • specify the selection criteria for the content to be moved,
  • specify the maximum number of content files to move in one execution.

 
 

Content Storage Services / Centera
In addition to the standard services described above, Content Server also offers the following optionally licensed functionalities. It is possible to add an optional storage policy engine via Content Storage Services (CSS) to automate storage allocation and migration based on policies. For example, frequently accessed content can be stored in a high-performance storage environment while rarely accessed content can be migrated to a more economical storage environment based on configured policies.

  • Content Services for EMC Centera (CSEC) adds support for Centera storage for guaranteed retention and immutability. Centera storage is suitable for storing large amounts of infrequently changing data that needs to be retained for a specific period.
  • Content Storage Services (CSS) enables the use of content storage and migration policies, which automate the assignment of content to various storage areas. CSS can be used for optimizing the use of storage infrastructure in the enterprise. CSS also provides features for content compression and de-duplication.
In DA, under “Store Management / Migration Policies”:

Store Management:
+ Storage
+ Assignment Policies
+ Migration Policies

 
 
The creation procedure is detailed starting on page 322 of the “Administrator Version 7.2 User Guide” http://www.emc.com/collateral/TechnicalDocument/docu57878.pdf; however, you will find below several important points:

  • The selection of objects for migration is possible via:
    • the selection criteria in the rules section of the migration policy job, to migrate documents from one store to another,
    • OR the selection of objects older than a given number of months, based on the creation date, modify date or access date of the document,
    • OR the specification of a DQL predicate (a valid WHERE clause) to migrate specific objects (a preview sketch follows this list):

      DQL query selection / Move content objects only(dmr_content)
      WHERE : any parent_id IN (SELECT r_object_id from my_document (ALL) WHERE any r_aspect_name = 'my_aspect_java') AND storage_id NOT IN (select r_object_id from dm_ca_store where name='centera_store_no_retention')

     

  • The migration policy job does not update the r_modify_date and r_modifier attributes of the target object type with the dm_query based on dm_document
     
  • The migration policy job also moves the renditions of the content.
     
  • Migration policies are jobs that execute the MIGRATE_CONTENT administration method; the migration policy jobs use the system administration method dm_MoveContent (see the “Parameters for MoveContent” section in the report below).
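Before enabling such a policy, it can be useful to preview which content objects the DQL predicate would actually select. Below is a minimal DFC sketch (not part of the original procedure): it simply counts the dmr_content objects matching the predicate of the example above; the class and method names are illustrative and an already-connected IDfSession is assumed.

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class MigrationPolicyPreview {
    // Counts the dmr_content objects that the policy's DQL predicate would select
    public static void previewSelection(IDfSession session) throws Exception {
        String dql = "SELECT count(*) AS nb FROM dmr_content"
                + " WHERE any parent_id IN (SELECT r_object_id FROM my_document (ALL) WHERE any r_aspect_name = 'my_aspect_java')"
                + " AND storage_id NOT IN (SELECT r_object_id FROM dm_ca_store WHERE name = 'centera_store_no_retention')";
        IDfQuery query = new DfQuery();
        query.setDQL(dql);
        IDfCollection coll = query.execute(session, IDfQuery.DF_READ_QUERY);
        try {
            if (coll.next()) {
                System.out.println("Content objects that would be migrated: " + coll.getString("nb"));
            }
        } finally {
            coll.close();
        }
    }
}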
     

Here is an example of an execution report, available via:

SELECT * FROM dm_sysobject WHERE folder('/Temp/Jobs/My_ArchivingMigrationPolicy') ORDER BY r_creation_date DESC;

Connected To MY_DOCBASE_DEV.MY_DOCBASE_DEV
My_ArchivingMigrationPolicy Tool In Progress at 11/01/2016 09:54:00
My_ArchivingMigrationPolicy Tool Completed at 11/01/2016 09:54:20.  Total duration was 0 minutes.
Calling SetJobStatus function...

 
--- Start d:\Documentum\dba\log\xxxxxx\sysadmin\My_ArchivingMigrationPolicyDoc.txt report output ----
My_ArchivingMigrationPolicy Report For DocBase MY_DOCBASE_DEV As Of 11/01/2016 09:54:02
   
 
Parameters for MoveContent:
-----------------------------
-custom_predicate is set to  any parent_id IN (SELECT r_object_id from my_document WHERE any r_aspect_name = 'my_aspect_java') 
-target_store is set to centera_store_no_retention
-target_object_type is set to dmr_content
-remove_original_content is set by the job/method to T
-max_migrate_count is set to 0
-batch_size is set to 500
-renditions is not set.  The system will default to
-parallel_degree is set to 0
-Trace Level is set to 0
-Inbox messages will be queued to MYUSEROWNER


Number of content objects migrated successfully
-----------------------------------------------
2


----- Start content migration log ------------------------

2016-11-01T09:54:02.625000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "Begin Content Migration."
2016-11-01T09:54:02.625000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "TARGET_STORE : centera_store_no_retention, SOURCE_STORE : <Not Specified>, OBJECT_ID : <Not Specified>, QUERY : any parent_id IN (SELECT r_object_id from my_document WHERE any r_aspect_name = 'my_aspect_java'), SYSOBJECT_QUERY : FALSE, MAX_MIGRATE_COUNT : 0, BATCH_SIZE : 500, LOG_FILE : d:\Documentum\dba\log\080220c58052dca7_move_content.log, REMOVE_ORIGINAL : TRUE, RENDITIONS : Primary, PARALLEL_DEGREE : 0, SOURCE_DIRECT_ACCESS : FALSE, TYPE_TO_QUERY : 1, JOB_NAME : <Not Specified>, ALL_VERSIONS : FALSE, IGNORE_CONTENT_VALIDATION_FAILURE : FALSE"
2016-11-01T09:54:02.625000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "select r_object_id from dmr_content where any parent_id IN (SELECT r_object_id from my_document WHERE any r_aspect_name = 'my_aspect_java') AND ANY (parent_id in (select r_object_id from dm_sysobject where r_lock_owner != ' '))"
2016-11-01T09:54:18.132000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "query execution complete"
2016-11-01T09:54:18.132000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "SELECT r_object_id FROM dmr_content WHERE any parent_id IN (SELECT r_object_id from my_document WHERE any r_aspect_name = 'my_aspect_java') ORDER BY r_object_id ASC"
2016-11-01T09:54:18.600000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "query execution complete"

2016-11-01T09:54:18.678000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_MIGRATING_OBJECT]info:  "Migrating content object 060xxxxxxxxxxxxx4"
2016-11-01T09:54:18.709000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "Executing query : SELECT ret.aging_method, max(ret.retention_date) AS max_ret_date,                                              min(ret.retention_date) AS has_null_date FROM                                              dmr_content_r cont, dm_sysobject_r sys, dm_retainer_s ret                                              WHERE cont.r_object_id = :p0 AND cont.parent_id = sys.r_object_id                                              AND sys.i_retainer_id = ret.r_object_id AND                                              ret.retention_date >= :p1                                              AND ret.r_retention_status=0 group by ret.aging_method"
2016-11-01T09:54:18.741000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "Time taken to execute query (secs) : 0.039108"
2016-11-01T09:54:18.741000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "dmProcessTargetRetentionForMigration, retention(secs) : -1 enableEBR :0 Retention Hold : 0"
2016-11-01T09:54:19.692000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "Updated the a_storage_type attribute(s) of 090220c5803b4df8 to centera_store_no_retention"
2016-11-01T09:54:19.739000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "select r_object_type from dm_sysobject (all) where r_object_id = '090220c5803b4df8'"
2016-11-01T09:54:19.770000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "query execution complete"
2016-11-01T09:54:19.786000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_PERF_REC]info:  "Total Time (secs)	Total Storage Time (secs)	Total Database Time (secs)	Storage Time for current file (secs)	Database Time for current file (secs)	Total Objects	Current file size	Xput (bytes/sec) for current file	Max storage time (secs)	Max database time (secs)	Max file size	Migration Xput (docs/sec)	Storage Xput (bytes/sec)	Total KBytes"
2016-11-01T09:54:19.786000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_PERF_REC]info:  "34.116	18.049	16.067	18.049	16.067	1	9392	275.016048774768	18.049	16.067	9392	0.0013117598780631	520.361238849798	9"
2016-11-01T09:54:19.786000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_MIGRATED_OBJECT]info:  "Migrated content object 060xxxxxxxxxxxxx4"

2016-11-01T09:54:19.848000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_MIGRATING_OBJECT]info:  "Migrating content object 060xxxxxxxxxxxxa2"
2016-11-01T09:54:19.880000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "Executing query : SELECT ret.aging_method, max(ret.retention_date) AS max_ret_date,                                              min(ret.retention_date) AS has_null_date FROM                                              dmr_content_r cont, dm_sysobject_r sys, dm_retainer_s ret                                              WHERE cont.r_object_id = :p0 AND cont.parent_id = sys.r_object_id                                              AND sys.i_retainer_id = ret.r_object_id AND                                              ret.retention_date >= :p1                                              AND ret.r_retention_status=0 group by ret.aging_method"
2016-11-01T09:54:19.880000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "Time taken to execute query (secs) : 0.0014391"
2016-11-01T09:54:19.880000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "dmProcessTargetRetentionForMigration, retention(secs) : -1 enableEBR :0 Retention Hold : 0"
2016-11-01T09:54:20.301000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "Updated the a_storage_type attribute(s) of 090220c5803c7590 to centera_store_no_retention"
2016-11-01T09:54:20.301000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "select r_object_type from dm_sysobject (all) where r_object_id = '090220c5803c7590'"
2016-11-01T09:54:20.301000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "query execution complete"
2016-11-01T09:54:20.301000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_PERF_REC]info:  "36.05	19.983	16.067	1.934	0	2	81735	42262.1509824199	18.049	16.067	81735	0.0554785020804438	4560.22619226342	88"
2016-11-01T09:54:20.301000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_MIGRATED_OBJECT]info:  "Migrated content object 060xxxxxxxxxxxxa2"
2016-11-01T09:54:20.332000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_MIGRATION_SUCCESS]info:  "Successful migration of batch (2 object(s))."
2016-11-01T09:54:20.332000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_OBJECTS_MIGRATED]info:  "2 object(s) successfully migrated."
2016-11-01T09:54:20.332000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_TOTAL_SKIPPED]info:  "0 objects skipped during migration"
2016-11-01T09:54:20.332000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT_PERF_FINAL]info:  "2 Objects Migrated. Total Storage Time(secs): 19.983, Total Database Time: 16.067, Total Content Size (KBytes): 88"
2016-11-01T09:54:20.332000	3600[6572]	01xxxxxxxxxxxxxxxxxx93	[DM_CONTENT_T_MIGRATE_CONTENT]info:  "End Content Migration."

----- End content migration log ------------------------

Report End  11/01/2016 09:54:20
--- End d:\Documentum\dba\log\xxxxxx\sysadmin\My_ArchivingMigrationPolicyDoc.txt report output ---


Best regards,

Huseyin

Hello,

Following the post Documentum : Moving Content Files via Administration Method MIGRATE_CONTENT (1/3), which covered content migration via the MIGRATE_CONTENT administration method, I propose to focus on content migration via the Records Migration Job.
 
As a reminder, Content Server supports three ways to move content files:

  • 1. MIGRATE_CONTENT administration method
  • 2. Records migration job
  • 3. Migration Policy (requires a Documentum Content Storage Services license)

 
 
 

2. RECORDS MIGRATION JOB
 

A records migration job moves batches of content files based on criteria you define. Records migration jobs are an alternate way to move a large number of content files if you do not want to execute MIGRATE_CONTENT manually or if you do not have a Content Storage Services license.
 
In DA, under “Job Management / Jobs / Records Migration Job”:

Job Management:
+ Jobs
+ Methods
+ Administration Methods

 

Records migration jobs move content files from one storage area to another. The target storage area can be another file store storage area or a secondary storage medium, such as an optical jukebox or a tape. If the target storage area is secondary storage, the storage must be defined in the repository as a storage area. That is, it must be represented in the repository by some type of storage object.
 
When you define the records migration job, you can define parameters for selecting the files that are moved. For example, you might want to move all documents that carry a particular version label or all documents created before a particular date. All the parameters you define are connected with an AND to build the query that selects the content files to move.
When a records migration job runs, it generates a report that lists the criteria selected for the job, the query built from the criteria, and the files selected for moving. You can execute the job in report-only mode, so that the report is created but the files are not actually moved.
 
You must have superuser privileges to create a records migration job.

 
 
The creation procedure is detailed starting on page 215 of the “Administrator Version 7.2 User Guide” http://www.emc.com/collateral/TechnicalDocument/docu57878.pdf; however, you will find below several important points:

  • The selection of objects for migration is possible either via selection by criteria OR via a query stored in a dm_query object
     
  • The selection of objects (documents) by criteria is possible ONLY via ONE SINGLE DCTM system attribute of the objects. Example:

    r_aspect_name IS 'my_aspect_java'

    So, selection by attribute is not possible:

    • by an attribute of an attached ASPECT,
    • by a custom attribute,
    • by an attribute in a registered table.

     

  • The selection of objects (documents) is also possible via a query stored in a dm_query object. The dm_query object can be created with the DCTM API (a DFC equivalent is sketched after this list):

    create,c,dm_query
    set,c,l,object_name
    myqueryhuo
    setfile,c,l,C:\temp\mig_query_huo.txt,crtext
    save,c,l

    The file mig_query_huo.txt must be on the developer PC (the machine where the API session runs) and must contain the query:

    SELECT doc.object_name, doc.r_object_id, doc.r_object_type FROM dm_document doc WHERE any doc.r_aspect_name = 'my_aspect_java';

    The query must return dm_sysobject objects or objects of a sub-type of dm_sysobject (i.e. containing the r_object_type attribute). Otherwise, the method automatically adds the filter “r_object_type <> ‘dm_plugin'” to the WHERE clause and we obtain the following ERROR:

    - Query Used: select r_object_id, parent_id, rendition, parent_count, content_size, full_format, format, page, page_modifier, storage_id from dmr_content where any parent_id IN (SELECT r_object_id from dm_document WHERE any r_aspect_name = 'my_aspect_java';
    Error: ExecQuery Failed for Main Query: select r_object_id, parent_id, rendition, parent_count, content_size, full_format, format, page, page_modifier, storage_id from dmr_content where r_object_type <> 'dm_plugin' AND any parent_id IN (SELECT r_object_id from dm_document WHERE any r_aspect_name = 'my_aspect_java'); in routine ExecShmeMethod
    [DM_QUERY_E_NOT_ATTRIBUTE]error: "You have specified an invalid attribute name (r_object_type)."

     
  • The migration job updates the r_modify_date and r_modifier attributes of the target object type with the dm_query based on dm_document
     
  • It is possible to designate and execute the job as a test only (report-only mode) via the job's configuration. After running the job, the report will contain the following note:

    NOTE: This is a report only - no objects will be moved.

     
  • More information can be specified in the job rules configuration:
    • type of objects : my_huo_document
    • target storage
    • move all versions
    • exclusion of objects that are already migrated
    • formats : Primary format and/or Annotations and/or Renditions
    • definition of version criteria : Affect the current version OR Affect the previous versions

     

  • The Records migration job uses the system Administration method dm_Migration.
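As mentioned in the dm_query point above, the query object can also be created programmatically. Below is a minimal DFC sketch, assuming dm_query behaves as a regular sysobject sub-type whose content is the query file (same object name and file path as in the API example; the helper class name is illustrative, and an already-connected IDfSession is assumed):

import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSysObject;

public class QueryObjectCreator {
    // Creates the dm_query object whose content file holds the selection query of the records migration job
    public static void createQueryObject(IDfSession session) throws Exception {
        IDfSysObject queryObj = (IDfSysObject) session.newObject("dm_query");
        queryObj.setObjectName("myqueryhuo");
        queryObj.setContentType("crtext");                // plain text format, as in the API example
        queryObj.setFile("C:\\temp\\mig_query_huo.txt");  // file containing the SELECT statement
        queryObj.save();
    }
}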
     

Here is an example of an execution report, available via:

SELECT * FROM dm_sysobject WHERE folder('/Temp/Jobs/My_ContentArchivingJob') ORDER BY r_creation_date DESC;

11/01/2016 5:03:57 PM Migration Agent Started
Connected To MY_DOCBASE_DEV.MY_DOCBASE_DEV
DMconnected to docbase MY_DOCBASE_DEV.MY_DOCBASE_DEV as mydctmuser
My_ContentArchivingJob Tool Completed at 11/01/2016 17:04:17.  Total duration was 0 minutes.
Calling SetJobStatus function...

 
--- Start d:\Documentum\dba\log\xxxxx\sysadmin\My_ContentArchivingJobDoc.txt report output ----
My_ContentArchivingJob Report For DocBase MY_DOCBASE_DEV As Of 11/01/2016 17:04:02
   
Summary of Rule 'My_ContentArchivingJob'
-----------------------------------------------------------------------
- Objects of this type (and it's subtypes) will be moved:  my_document
- Objects will be moved to filestore:                      centera_store_no_retention
- Object Selection:
- Objects are being selected by Query Object:              myqueryhuo
- With Each Selected Object: 
- The primary format will be moved.                        PRIMARY FORMAT
- Renditions will be moved.                                RENDITIONS
- Annotations will be moved.                               ANNOTATIONS
- The current version will be moved.                       CURRENT VERSION
- Previous Versions will NOT be moved.
- Only Including the Root Node of a Virtual Document
- Ignoring documents on filestores marked as 'Secondary'
- Number of filestores considered Secondary Storage:       0
 
- Query Used: SELECT doc.object_name, doc.r_object_id, doc.r_object_type FROM my_document doc WHERE any doc.r_aspect_name = 'my_aspect_java';
 
Object Name             Owner Name          Component Moved
-----------             ----------          ----------------------------
MY DOC1                 mydctmuser              Primary Format Current Version
    -                        -                   -
MY DOC2                 mydctmuser              Primary Format Current Version
    -                        -                   -
 
Report End  11/01/2016 17:04:17
--- End d:\Documentum\dba\log\xxxxx\sysadmin\My_ContentArchivingJobDoc.txt report output ---
Successful Execution of Method
Migration Agent Disconnecting...
Migration Agent Completed at 11/01/2016 5:04:18 PM

Best regards,

Huseyin

Hi,

Just a short post to present a solution to force a job to start via DQL:

UPDATE dm_job OBJECTS set run_now=true, set a_next_invocation=DATE(NOW) WHERE object_name = '[job_name]';

You can check the log, and verify that the next invocation date is automatically reset by the agent_exec process, depending on the scheduling parameters of the job:

SELECT run_now, a_next_invocation from dm_job WHERE object_name = '[job_name]';
## LOGS
SELECT * FROM dm_sysobject WHERE folder('/Temp/Jobs/[job_name]') ORDER BY r_creation_date DESC;
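For completeness, the same UPDATE can also be issued from DFC. Below is a minimal sketch, assuming an already-connected IDfSession; the class name and the result handling are illustrative:

import com.documentum.fc.client.DfQuery;
import com.documentum.fc.client.IDfCollection;
import com.documentum.fc.client.IDfQuery;
import com.documentum.fc.client.IDfSession;

public class JobStarter {
    // Forces a job to be picked up at the next agent_exec polling cycle
    public static void runJobNow(IDfSession session, String jobName) throws Exception {
        IDfQuery query = new DfQuery();
        query.setDQL("UPDATE dm_job OBJECTS SET run_now = TRUE, SET a_next_invocation = DATE(NOW)"
                + " WHERE object_name = '" + jobName + "'");
        IDfCollection coll = query.execute(session, IDfQuery.DF_EXEC_QUERY);
        try {
            if (coll.next()) {
                // the UPDATE ... OBJECTS statement typically returns the number of updated objects
                System.out.println("Objects updated: " + coll.getString("objects_updated"));
            }
        } finally {
            coll.close();
        }
    }
}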

That’s all!!

Huseyin

Hello,

Following my first post Documentum : Administration Methods : Presentation and examples GET_PATH, DO_METHOD, CHECK_SECURITY, GET_FILE_URL concerning the administration methods, I propose to focus on content migration, in particular via:

  • 1. MIGRATE_CONTENT administration method
  • 2. Records migration job
  • 3. Migration Policy (requires a Documentum Content Storage Services license)

 
 
 

1. MIGRATE_CONTENT ADMINISTRATION METHOD
 
The MIGRATE_CONTENT method allows moving multiple content files out of a file store, ca store, blob store or distributed store.
 
Examples of use via the DCTM API:
Example 1:
The API call below moves all objects from filestore_01 to fl1. The log file contains the messages generated by the method.

API> apply,c,NULL,MIGRATE_CONTENT,SOURCE_STORE,S,filestore_01,TARGET_STORE,S,fl1,
LOG_FILE,S,C:\temp\gen1.log,REMOVE_ORIGINAL,B,T

 
Example 2:
The API call below moves all objects that were created after 23.06.2009 from filestore filestore_01 to target store fl1, via a query specified to select objects based on criteria:

API> apply,c,NULL,MIGRATE_CONTENT,TARGET_STORE,S,fl1,QUERY,S,r_creation_date >
DATE('23.06.2009 00:00:01'),SYSOBJECT_QUERY,B,T,LOG_FILE,S,C:\temp\gen.log,
REMOVE_ORIGINAL,B,F

 
 
 
Use of migrate_content method:
The method moves content files from one storage area to another and returns a collection with one query result object. The object has one integer property, named result, whose value is the number of contents successfully migrated.

EXECUTE migrate_content [FOR] object_id
WITH target_store='target_storage_name'
[,renditions=value][,remove_original=TRUE|FALSE][,log_file='log_file_path']
[,source_direct_access_ok=T[,direct_copy=T
[,update_only=T,command_file_name='command_file_name']]
|,source_direct_access_ok=T[,direct_move=T
[,update_only=T,command_file_name='command_file_name']]
]

 
 
WARNING: The method operates on dmr_content objects, resetting their storage_id property to the new storage area and resetting the data ticket property. It also updates the i_vstamp property. The i_vstamp property is also incremented for any SysObject that contains the content file as its first primary content (page 0). Similarly, if the content objects are associated with persistently cached objects, the next time the cached objects are accessed by the client, they will be refreshed. So, if you try to save an object associated with migrated dmr_content objects after the migration (in the same session), you could obtain the following errors, unless you first did a docObject.fetch(…) (see the sketch below):

DfException:: THREAD: http-0.0.0.0-0.0.0.0-9080-5; MSG: [DM_SYSOBJECT_E_CANT_SAVE]error: "Cannot save 090xxxxxxx15 sysobject."; ERRORCODE: 100; NEXT: DfException:: THREAD: http-0.0.0.0-0.0.0.0-9080-5; MSG: [DM_SYSOBJECT_E_VERSION_MISMATCH]error: "save of object failed because of version mismatch: old version was 30"; ERRORCODE: 100; NEXT: null

Followed by: DfException:: THREAD: http-0.0.0.0-0.0.0.0-9080-5; MSG: [DM_SYSOBJECT_E_VERSION_MISMATCH]error: "save of object failed because of version mismatch: old version was 30"; ERRORCODE: 100; NEXT: null
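As a minimal illustration of the fetch(…) workaround mentioned above: the sketch below assumes doc is an IDfSysObject handle obtained in the same session before the migration ran, and the setSubject(...) call is just an arbitrary example of a modification.

import com.documentum.fc.client.IDfSysObject;

public class MigrationSaveHelper {
    // Refreshes a stale sysobject handle after MIGRATE_CONTENT, then saves it safely
    public static void refreshAndSave(IDfSysObject doc) throws Exception {
        doc.fetch(null);   // re-read the object to pick up the i_vstamp incremented by the migration
        doc.setSubject("updated after content migration");   // arbitrary modification, for illustration only
        doc.save();        // without the fetch() above, this save could raise DM_SYSOBJECT_E_VERSION_MISMATCH
    }
}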


Invoking administration methods via DA

To execute an administration method manually, use Documentum Administrator in the node (Administration/Job Management/Administration Methods).


Invoking administration methods via DQL

To execute an administration method in an application, use the DQL EXECUTE statement. You can also use EXECUTE to invoke an administration method through IDQL.

Note : it is possible to use IDQL32.exe or IAPI32.exe to execute DQL or API commands. The scheduling can be done via OS tasks (e.g. the Windows Task Scheduler).
 
Example:

EXECUTE migrate_content FOR '090xxxxxxxxx70' WITH target_store='centera_store_no_retention',renditions='all',remove_original=TRUE;
...outputs:
result=2

 
 
Example of feedback found on the Internet:
Migration of 8.5 Terabytes of content from one file store to multiple file stores. I expect to be doing this for a while, but it is working fine.
Right now I am 3TB into the process and I have almost filled one of my target filestores.
I run everything in RepoInt running on the Content Server directly. The command (with different filestore and object type names) that is running is:

Execute MIGRATE_CONTENT With
target_store='filestore_new',
max_migrate_count=100000,
parallel_degree=10,
remove_original=True,
sysobject_query=True,
log_file='c:\temp\ContentMoveLog37.tst',
query='r_object_id in (select r_object_id from custom_document_type where a_storage_type = ''filestore_01'')'

As you can see, I am on my 37th batch. I have played with the max_migrate_count to get it to run how long I desire. With my content profile, the 100,000 documents will take me 10 hours, but your mileage will vary. I would migrate 10,000 items first and time it. That 10K will be a longer 10K on average, but it will give you a baseline.
Only run one of these queries at a time. I had some minor issues. I think I could do it safely, but in general, one at a time should allow you to monitor progress quite readily.
In general, I am trying to move static images from my primary filestore to a new filestore that is designed to better handle my static content. This allows my primary content filestore to focus on just the basic Documentum reports and miscellaneous standard content.


Invoking administration methods via API with APPLY instruction

To execute an administration method in an application, use the specific API instruction APPLY. You can also use these instructions through IAPI. The API Apply method is the API equivalent of the DQL EXECUTE statement. The Apply method returns a collection identifier for a collection that contains the results of the specified method.

Note : it is possible to use IDQL32.exe or IAPI32.exe to execute DQL or API commands. The scheduling can be done via OS tasks (e.g. the Windows Task Scheduler).
 
Example:

API> apply,c,content_obj_id,GET_PATH,STORE,S,filestore_id

 
Example:

API> apply,c,090xxxxxxxxx70,MIGRATE_CONTENT,TARGET_STORE,S,
centera_store_no_retention,RENDITIONS,S,all,REMOVE_ORIGINAL,B,T
...
q0
API> next,c,q0
...
OK
API> get,c,q0,result
...
2


Invoking administration methods via DFC – getSession().apply(…)

To execute an administration method in an application, use the IDfSession.apply method. For information about using IDfSession.apply to invoke an administration method, refer to the Javadocs.

Note : It’s possible to include this code in a custom job executed by dm_agent_exec.
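The examples below use an idfSession field, assumed to be an already-connected IDfSession. For completeness, here is a minimal sketch of how such a session can be obtained (repository name, user and password are placeholders):

import com.documentum.com.DfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.common.IDfLoginInfo;

public class SessionFactory {
    // Builds a session manager and opens a session on the target repository
    public static IDfSession openSession(String repository, String user, String password) throws Exception {
        DfClientX clientx = new DfClientX();
        IDfClient client = clientx.getLocalClient();
        IDfSessionManager sessionManager = client.newSessionManager();
        IDfLoginInfo loginInfo = clientx.getLoginInfo();
        loginInfo.setUser(user);
        loginInfo.setPassword(password);
        sessionManager.setIdentity(repository, loginInfo);
        return sessionManager.getSession(repository);   // release later with sessionManager.release(session)
    }
}

The returned session would then play the role of the idfSession used in the two migration examples below.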

public void migrationWithSessionApply(String objectId) throws Exception {
    System.out.println("----- migrationWithSessionApply(" + objectId + ")");
    IDfCollection dfCollection = null;
    try {
        // Build the three parallel lists expected by IDfSession.apply():
        // argument names, argument data types and argument values.
        IDfList args = new DfList();
        IDfList dataType = new DfList();
        IDfList values = new DfList();
        // Target storage area for the migration
        args.appendString("TARGET_STORE");
        dataType.appendString("S");
        values.appendString("centera_store_no_retention");
        // Move all renditions of the content
        args.appendString("RENDITIONS");
        dataType.appendString("S");
        values.appendString("all");
        // Remove the original content files from the source store
        args.appendString("REMOVE_ORIGINAL");
        dataType.appendString("B");
        values.appendString("T");
        // Execute the MIGRATE_CONTENT administration method on the connected session
        dfCollection = idfSession.apply(objectId, "MIGRATE_CONTENT", args, dataType, values);
        if (dfCollection != null && dfCollection.next()) {
            // "result" holds the number of contents successfully migrated
            System.out.println(dfCollection.getString("result"));
        }
    } finally {
        if (dfCollection != null) {
            dfCollection.close();
        }
    }
}

…outputs:

----- migrationWithSessionApply(090xxxxxxxxx70)
2

Below, an example of BEFORE/AFTER execution on a specific document:

* ----- BEFORE MIGRATION (primary rendition and XML rendition)
* API> getpath,c,090xxxxxxxxx70
* ...
* \\MYSERVER\data\MY_DOCBASE_DEV\content_storage_01\xxxxx\80\22\c9\6e.dxl
*
* API> getpath,c,090xxxxxxxxx70,,xml
* ...
* \\MYSERVER\data\MY_DOCBASE_DEV\content_storage_01\xxxxx\80\22\c9\6f.xml
*
* ----- AFTER MIGRATION (primary rendition and XML rendition)
* API> getpath,c,090xxxxxxxxx70
* ...
* 04C8NCQ77GJ9XXXXXXXXXXXXX3UBMNVECR6QGB
*
* API> getpath,c,090xxxxxxxxx70,,xml
* ...
* FC9VKAJ2UNJ3VXXXXXXXXXXXXXXXXXX1NUAV


Invoking administration methods via DFC – DfAdminCommand.getCommand(…)

There is an alternative way to execute an administration method in an application: DfAdminCommand.getCommand(…).

Note : It’s possible to include this code in a custom job executed by dm_agent_exec.

public void migrationWithApplyMigrateContent(String objectId) throws Exception {
    System.out.println("----- migrationWithApplyMigrateContent(" + objectId + ")");
    IDfCollection dfCollection = null;
    try {
        // Obtain the typed admin command wrapping the MIGRATE_CONTENT administration method
        IDfApplyMigrateContent applyMigrateContent =
                (IDfApplyMigrateContent) DfAdminCommand.getCommand(IDfAdminCommand.APPLY_MIGRATE_CONTENT);
        applyMigrateContent.setContentId(new DfId(objectId));
        applyMigrateContent.setTargetStore("centera_store_no_retention");
        applyMigrateContent.setString("RENDITIONS", "all");   // move all renditions
        applyMigrateContent.setRemoveOriginal(true);          // remove content from the source store
        applyMigrateContent.setLogFile("C:\\temp\\migration_2.log");
        // Execute against the connected session
        dfCollection = applyMigrateContent.execute(idfSession);
        //dfCollection = applyMigrateContent.execute(idfSession.getSessionManager().newSession("MY_DOCBASE_DEV"));
        if (dfCollection != null && dfCollection.next()) {
            // "result" holds the number of contents successfully migrated
            System.out.println(dfCollection.getString("result"));
        }
    } finally {
        if (dfCollection != null) {
            dfCollection.close();
        }
    }
}

…outputs:

----- migrationWithApplyMigrateContent(090xxxxxxxxx70)
2

Below, an example of BEFORE/AFTER execution on a specific document:

* ----- BEFORE MIGRATION (primary rendition and XML rendition)
* API> getpath,c,090xxxxxxxxx70
* ...
* \\MYSERVER\data\MY_DOCBASE_DEV\content_storage_01\xxxxx\80\22\c9\6e.dxl
*
* API> getpath,c,090xxxxxxxxx70,,xml
* ...
* \\MYSERVER\data\MY_DOCBASE_DEV\content_storage_01\xxxxx\80\22\c9\6f.xml
*
* ----- AFTER MIGRATION (primary rendition and XML rendition)
* API> getpath,c,090xxxxxxxxx70
* ...
* 04C8NCQ77GJ9XXXXXXXXXXXXX3UBMNVECR6QGB
*
* API> getpath,c,090xxxxxxxxx70,,xml
* ...
* FC9VKAJ2UNJ3VXXXXXXXXXXXXXXXXXX1NUAV

Best regards,

Huseyin
