Sitemap

Thursday, October 29, 2015

SOAPUI issue when configured with certificates

Comment out the following line in the <soapUI installation location>/jre/lib/security/java.security file:

jdk.certpath.disabledAlgorithms=MD2, RSA keySize < 1024
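After commenting it out with a leading #, the line should look like this:

```
# jdk.certpath.disabledAlgorithms=MD2, RSA keySize < 1024
```

Restart SoapUI for the change to take effect.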

Monday, October 26, 2015

UCM: GET_SEARCH_RESULTS using RIDC

import java.io.IOException;

import oracle.stellent.ridc.IdcClient;
import oracle.stellent.ridc.IdcClientException;
import oracle.stellent.ridc.IdcClientManager;
import oracle.stellent.ridc.IdcContext;
import oracle.stellent.ridc.model.DataBinder;
import oracle.stellent.ridc.model.DataObject;
import oracle.stellent.ridc.model.DataResultSet;
import oracle.stellent.ridc.model.serialize.HdaBinderSerializer;
import oracle.stellent.ridc.protocol.ServiceResponse;

public class TestRIDCSearch {
    public static void main(String[] args) {
        IdcClientManager manager = new IdcClientManager();
        try {
            IdcClient idcClient = manager.createClient("idc://localhost:4444");
            IdcContext userContext = new IdcContext("sysadmin");
            HdaBinderSerializer serializer = new HdaBinderSerializer("UTF-8", idcClient.getDataFactory());

            DataBinder dataBinder = idcClient.createBinder();
            dataBinder.putLocal("IdcService", "GET_SEARCH_RESULTS");
            
            /* The easiest way to figure out the QueryText expected by the content server is to execute the query from the content server search screen
            in the native UI and trace that call. Gather a full verbose system audit trace using the following trace sections:
            system, requestaudit, searchquery, searchcache
            The trace output should look like this:
            >searchquery/6 03.25 14:24:41.285 IdcServer-826 preparedQueryText: ( dDocName `test` dSecurityGroup `Public` dInDate >= `3/25/15 12:00 AM` )*/
            
            /*NOTE: UCM limits the ResultCount value according to the MaxResults configuration setting for Content Server. 
             * To increase the MaxResults setting, add the setting MaxResults=<integer> in config.cfg*/
            
            dataBinder.putLocal("QueryText", "dDocName `test` dSecurityGroup `Public` dInDate >= `3/25/15 12:00 AM`");
            serializer.serializeBinder(System.out, dataBinder);
            ServiceResponse response = idcClient.sendRequest(userContext, dataBinder);

            DataBinder responseData = response.getResponseAsBinder();
            serializer.serializeBinder(System.out, responseData);
            DataResultSet resultSet = responseData.getResultSet("SearchResults");

            for (DataObject dataObject : resultSet.getRows()) {
                System.out.println("Title is: " + dataObject.get("dDocTitle"));
            }
        } catch (IdcClientException ice) {
            ice.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}


Installation of multiple versions of JDeveloper

Open the <Installation_Folder>\jdeveloper\jdev\bin\jdev.boot file.

Locate the ide.user.dir property and point it to a separate user directory for each installation, for example:
ide.user.dir=C:/JDev11117SOADefaultDomain

Friday, October 9, 2015

Prerequisites before SOA Development

Creation of JMS Queue
Queue: Defines a point-to-point destination type, which is used for asynchronous peer communication. A message delivered to a queue is distributed to only one consumer.

In the WebLogic Server Administration Console, navigate to Services > Messaging > JMS Modules and click SOAJMSModule.


In the Summary of Resources table, click New. Select the resource type: Queue and click Next.

Enter name: demoFulfillmentQueue and JNDI name: jms/demoFulfillmentQueue

Select the subdeployment SOASubDeployment and the JMS server SOAJMSServer.

Creation of Connection Factory
Connection Factory: Defines a set of connection configuration parameters that are used to create connections for JMS clients.

Select the resource type: Connection Factory and click Next.
Enter name: demoCF and JNDI name: jms/demoCF.

You should now see both the queue and the connection factory listed in the Summary of Resources table.

Now add the connection pool. The connection pool is configured in the JMSAdapter application and uses a Deployment Plan. First, create a directory to contain that plan.

Create a directory JMSPlan: C:\Oracle\Oracle_SOA1\soa\JMSPlan

Click Deployments. Click JMS Adapter. Click the Configuration tab and then the Outbound Connection Pools tab.

Click New. Select the factory oracle.tip.adapter.jms.IJmsConnectionFactory and click Next. Enter the JNDI name eis/Queue/demo.

Click eis/Queue/demo and change the ConnectionFactoryLocation property to jms/demoCF. Then update the JMSAdapter deployment so the change takes effect.
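For reference, a composite that produces messages to this queue would reference the pool through a .jca file generated by the JDeveloper adapter wizard. A rough sketch only; names such as produceFulfillment and Produce_Message are hypothetical:

```xml
<adapter-config name="produceFulfillment" adapter="Jms Adapter"
                xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
   <!-- Points at the outbound connection pool configured above -->
   <connection-factory location="eis/Queue/demo"/>
   <endpoint-interaction portType="Produce_Message_ptt" operation="Produce_Message">
      <interaction-spec className="oracle.tip.adapter.jms.outbound.JmsProduceInteractionSpec">
         <!-- The queue created earlier -->
         <property name="DestinationName" value="jms/demoFulfillmentQueue"/>
         <property name="PayloadType" value="TextMessage"/>
      </interaction-spec>
   </endpoint-interaction>
</adapter-config>
```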

=====================================================================

Create a data source named soademoDatabase with JNDI name jdbc/soademoDatabase.

Create a directory DBPlan: C:\Oracle\Oracle_SOA1\soa\DBPlan

Click Deployments. Click DbAdapter. Click the Configuration tab and then the Outbound Connection Pools tab.

Enter the JNDI Name as follows: eis/DB/soademoDatabase

Edit the connection pool to reference the data source (set its xADataSourceName or dataSourceName property to jdbc/soademoDatabase), then update the DbAdapter deployment. The connection pool is now configured.

Monday, September 21, 2015

SOA Glossary

Service Component Architecture (SCA)
Service Component Architecture (SCA) is a set of specifications that describe a model for building applications using a service-oriented architecture.
1. Services are assembled together to form a composite application that creates a solution that addresses a specific business requirement.
2. Composite applications may contain new services (specifically for the application) and business functions from existing systems and applications (reused in the composite application).


SOA Composite
A SOA composite is an assembly of services, service components, and references designed and deployed together in a single application. Wiring between the service, service component, and reference enables message communication.
  1. Services provide the outside world with an entry point to the SOA composite application. The service advertises its capabilities (also known as operations) to external applications with a WSDL (Web Services Description Language) file. The binding of the service describes the protocols that can communicate with the application. Examples include SOAP/HTTP or a JCA adapter.
  2. Service components are the building blocks of a SOA composite application. Oracle SOA Suite 11g includes the following components:
    - The BPEL Process component enables design and execution of a business process that integrates a series of business activities and services into an end-to-end process flow.
    - The Business Rules component provides the means of making business decisions based on defined rules.
    - The Human Task component allows you to model a workflow that describes tasks for users or groups to perform as part of an end-to-end business process flow.
    - The Mediator component is used for validation, filtering, transformation, and routing of message data between components.
  3. References enable messages to be sent from the SOA composite application to external services in the outside world.

Service Data Object (SDO)

An SDO exposes any data source as a service, which enables retrieval and manipulation of the data in an XML format through service operations. The task of connecting applications to data sources is performed by a data mediator service.
Oracle SOA Suite 11g enables a BPEL process to access an SDO through an entity variable, a special type of BPEL variable associated with an SDO exposed as a service. Oracle ADF-BC components can be deployed simultaneously as a Web Service and an SDO.


WSIL (Web Services Inspection Language) Connection
The http://localhost:8001/inspection.wsil URL accesses a Java EE application that dynamically discovers WSDL URL endpoints for Java EE and SOA composite applications deployed to the same run-time server.


Adapters
Adapters provide a service interface that:
• Exposes external application functionality in a form that can be used by SOA composite application components
• Converts request and responses into a form suitable for other (external) systems
• Implements interfaces by using the Java Connector Architecture (JCA) API standards


Oracle Web Services Manager (OWSM) Policy Manager

Oracle WSM Policy Manager provides the infrastructure for enforcing global security and auditing policies in the service infrastructure. It secures applications by protecting endpoints and by setting and propagating identity. It also provides a standard mechanism for signing messages, performing encryption, performing authentication, and providing role-based access control.


The Oracle Metadata Repository

The Oracle SOA Suite 11g runtime environment requires MDS to maintain SOA application configuration and runtime information. It is used to manage deployed services and composite applications.
The MDS can also be used as a central location for storing and referencing shared service artifacts, such as business events, rule sets for Oracle Business Rules, XSLT files for Oracle Mediator, XSD and WSDL documents for Oracle BPEL Process Manager, and other service documents, which can be deployed in a sharable archive format known as the Metadata archive (.mar files).


Business Events and the Event Delivery Network

A business event is a way for one application to notify another application of a significant occurrence to the business. When a business event is published, another application (or service component) can subscribe to it and initiate whatever processing is implied by that event. For example, when product stock levels are updated in an inventory database, an event can serve as a trigger or signal for another process to fulfill orders that have been on hold until products become available.
Business events are typically an asynchronous fire-and-forget (one-way) notification of a business occurrence.
Events are defined by using Event Definition Language (EDL) to specify the name and structure of an event. Definitions for business events are stored in the MDS, and published in the Event Delivery Network (EDN).
The Event Delivery Network is designed to handle asynchronous messaging arising from a business or system event. The EDN is not messaging infrastructure. It provides an application with a declarative publish-subscribe implementation to publish events so that a composite application with a Mediator component can subscribe to events that trigger execution of the composite application.
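As an illustration, an EDL file declares the event name and its XSD-typed payload. A minimal sketch; the namespaces, element name, and file location here are hypothetical:

```xml
<definitions xmlns="http://schemas.oracle.com/events/edl"
             targetNamespace="http://example.com/events/InventoryEvents">
   <!-- Import the schema that types the event payload -->
   <schema-import namespace="http://example.com/inventory"
                  location="xsd/inventory.xsd"/>
   <!-- Event name plus the element carried in the event body -->
   <event-definition name="StockLevelUpdated">
      <content xmlns:inv="http://example.com/inventory" element="inv:stockUpdate"/>
   </event-definition>
</definitions>
```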

Wednesday, September 16, 2015

SOAP-based web service in Java

TimeServer.java
package com.sonal;

import javax.jws.WebService;
import javax.jws.WebMethod;
import javax.jws.soap.SOAPBinding;
import javax.jws.soap.SOAPBinding.Style;

/**
 * SEI for a web service that returns the current time as either a string or as
 * the elapsed milliseconds from the Unix epoch, midnight January 1, 1970 GMT.
 * 
 * The annotation @WebService signals that this is the SEI (Service Endpoint
 * Interface). @WebMethod signals that each method is a service operation.
 *
 * The @SOAPBinding annotation impacts the under-the-hood construction of the
 * service contract, the WSDL (Web Services Description Language) document.
 */
@WebService
@SOAPBinding(style = Style.DOCUMENT)
public interface TimeServer {
 @WebMethod
 String getTimeAsString();

 @WebMethod
 long getTimeAsElapsed();
}


TimeServerImpl.java
package com.sonal;

import java.util.Date;
import javax.jws.WebService;

/**
 * The @WebService property endpointInterface links this SIB (Service
 * Implementation Bean) to the SEI (com.sonal.TimeServer). Note that the method
 * implementations are not annotated as @WebMethods.
 */
@WebService(endpointInterface = "com.sonal.TimeServer")
public class TimeServerImpl implements TimeServer {
 public String getTimeAsString() {
  return new Date().toString();
 }

 public long getTimeAsElapsed() {
  return new Date().getTime();
 }
}


TimeServerPublisher.java
package com.sonal;

import javax.xml.ws.Endpoint;

/**
 * This application publishes the web service whose SIB is
 * com.sonal.TimeServerImpl. For now, the service is published at network address
 * 127.0.0.1, which is localhost, and at port number 9876, as this port is
 * likely available on any desktop machine. The publication path is /ts, an
 * arbitrary name.
 *
 * The Endpoint class has an overloaded publish method. In this two-argument
 * version, the first argument is the publication URL as a string and the second
 * argument is an instance of the service SIB, in this case
 * com.sonal.TimeServerImpl.
 *
 * The application runs indefinitely, awaiting service requests. It needs to be
 * terminated at the command prompt with control-C or the equivalent.
 *
 * Once the application is started, open a browser to the URL
 *
 ** http://127.0.0.1:9876/ts?wsdl
 *
 * to view the service contract, the WSDL document. This is an easy test to
 * determine whether the service has deployed successfully. If the test
 * succeeds, a client then can be executed against the service.
 */

public class TimeServerPublisher {
 public static void main(String[] args) {
  // 1st argument is the publication URL
  // 2nd argument is an SIB instance
  Endpoint.publish("http://127.0.0.1:9876/ts", new TimeServerImpl());
 }
}



WSDL document for the TimeServer service
<?xml version="1.0" encoding="UTF-8"?>
<!-- Published by JAX-WS RI at http://jax-ws.dev.java.net. RI's version is JAX-WS RI 2.2.4-b01. -->
<!-- Generated by JAX-WS RI at http://jax-ws.dev.java.net. RI's version is JAX-WS RI 2.2.4-b01. -->
<definitions xmlns="http://schemas.xmlsoap.org/wsdl/" xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:tns="http://sonal.com/" xmlns:wsam="http://www.w3.org/2007/05/addressing/metadata" xmlns:wsp="http://www.w3.org/ns/ws-policy" xmlns:wsp1_2="http://schemas.xmlsoap.org/ws/2004/09/policy" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd" xmlns:xsd="http://www.w3.org/2001/XMLSchema" targetNamespace="http://sonal.com/" name="TimeServerImplService">
   <types>
      <xsd:schema>
         <xsd:import namespace="http://sonal.com/" schemaLocation="http://127.0.0.1:9876/ts?xsd=1" />
      </xsd:schema>
   </types>
   <message name="getTimeAsString">
      <part name="parameters" element="tns:getTimeAsString" />
   </message>
   <message name="getTimeAsStringResponse">
      <part name="parameters" element="tns:getTimeAsStringResponse" />
   </message>
   <message name="getTimeAsElapsed">
      <part name="parameters" element="tns:getTimeAsElapsed" />
   </message>
   <message name="getTimeAsElapsedResponse">
      <part name="parameters" element="tns:getTimeAsElapsedResponse" />
   </message>
   <portType name="TimeServer">
      <operation name="getTimeAsString">
         <input wsam:Action="http://sonal.com/TimeServer/getTimeAsStringRequest" message="tns:getTimeAsString" />
         <output wsam:Action="http://sonal.com/TimeServer/getTimeAsStringResponse" message="tns:getTimeAsStringResponse" />
      </operation>
      <operation name="getTimeAsElapsed">
         <input wsam:Action="http://sonal.com/TimeServer/getTimeAsElapsedRequest" message="tns:getTimeAsElapsed" />
         <output wsam:Action="http://sonal.com/TimeServer/getTimeAsElapsedResponse" message="tns:getTimeAsElapsedResponse" />
      </operation>
   </portType>
   <binding name="TimeServerImplPortBinding" type="tns:TimeServer">
      <soap:binding transport="http://schemas.xmlsoap.org/soap/http" style="document" />
      <operation name="getTimeAsString">
         <soap:operation soapAction="" />
         <input>
            <soap:body use="literal" />
         </input>
         <output>
            <soap:body use="literal" />
         </output>
      </operation>
      <operation name="getTimeAsElapsed">
         <soap:operation soapAction="" />
         <input>
            <soap:body use="literal" />
         </input>
         <output>
            <soap:body use="literal" />
         </output>
      </operation>
   </binding>
   <service name="TimeServerImplService">
      <port name="TimeServerImplPort" binding="tns:TimeServerImplPortBinding">
         <soap:address location="http://127.0.0.1:9876/ts" />
      </port>
   </service>
</definitions>

The portType section, near the top, groups the operations that the web service delivers, in this case the operations getTimeAsString and getTimeAsElapsed, which are the two Java methods declared in the SEI and implemented in the SIB. The WSDL portType is like a Java interface in that the portType presents the service operations abstractly but provides no implementation detail.
The other WSDL section of interest is the last, the service section, and in particular the service location, in this case the URL http://localhost:9876/ts. The URL is called the service endpoint and it informs clients about where the service can be accessed.

TimeClient.java
package com.sonal;

import javax.xml.namespace.QName;
import javax.xml.ws.Service;
import java.net.URL;

class TimeClient {
 public static void main(String[] args) throws Exception {
  URL url = new URL("http://localhost:9876/ts?wsdl");
  // Qualified name of the service:
  // 1st arg is the service URI
  // 2nd is the service name published in the WSDL
  QName qname = new QName("http://sonal.com/", "TimeServerImplService");
  // Create, in effect, a factory for the service.
  Service service = Service.create(url, qname);
  // Extract the endpoint interface, the service "port".
  TimeServer port = service.getPort(TimeServer.class);
  System.out.println(port.getTimeAsString());
  System.out.println(port.getTimeAsElapsed());
 }
}


Thursday, September 10, 2015

Web services using Eclipse

Apache CXF
1. JAX-WS annotation based web service: http://javahonk.com/jax-ws-web-service-eclipse/
2. Generate client class from wsdl: http://javahonk.com/create-jax-wx-web-service-client/

SOAP Web Service and Web Service Client in Eclipse
http://www.simplecodestuffs.com/create-and-deploy-web-service-and-web-service-client-in-eclipse/

Creating a JAX-WS web service, but no container
http://www.simplecodestuffs.com/jax-ws-web-services-using-eclipse/
http://examples.javacodegeeks.com/enterprise-java/jws/jax-ws-hello-world-example-document-style/
http://www.java2blog.com/2013/03/jaxws-web-service-eclipse-tutorial.html
http://java.globinch.com/enterprise-java/web-services/jax-ws/java-jax-ws-tutorial-develop-web-services-clients-consumers/

Create RESTful web services in Java (JAX-RS) using Jersey
https://afsinka.wordpress.com/2015/12/27/restful-web-service-example-with-jersey-2-and-tomcat-8/
http://javapapers.com/web-service/restful-services-crud-with-java-jax-rs-jersey/
http://www.java2blog.com/2013/04/create-restful-web-servicesjax-rs-using.html

Monday, September 7, 2015

Web Services Description Language (WSDL)


• Types define the data types used in messages. These types are often drawn from the XML Schema Language.
• Messages describe the part(s) of the input, output, and fault messages exchanged with the calling program.
• Operations provide a name for the action performed on messages.
• PortTypes group message(s) with operation(s).

WSDL documents for services deployed outside WebLogic Server also include the concrete elements of the WSDL, which provide additional information about how and where to access the service:
• The binding describes how a given portType operation will be transmitted, such as HTTP or SOAP (that is, the protocol) and information about where the service is located.
• The port specifies a combination of a network address and a binding, which constitute an endpoint.
• A service groups ports together. A service reveals to a calling program where to access the web service, and through which port. It also describes how the communication messages are defined.



Section 1 includes an import statement and a reference to file po.xsd.
Section 2 describes message requestMessage. The description includes the name of the part, and the name of the element that describes the message structure of that part. Message element PurchaseOrder is fully described in the imported po.xsd document.
Finally, section 3 describes port type execute_ptt, which groups a specific operation (execute) with message requestMessage.

The number and order of messages in the portType definition of a WSDL are important. Because we see only a single message listed in the example above, we know that operation execute is a one-way operation. If we saw a second message, as in the example below, we could assume that the operation returns a response upon completion.
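For instance, a request-response version of that portType would list both an input and an output message (the replyMessage name here is hypothetical):

```xml
<portType name="execute_ptt">
   <operation name="execute">
      <input message="tns:requestMessage"/>
      <output message="tns:replyMessage"/>
   </operation>
</portType>
```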


Thursday, August 27, 2015

UCM: dInDate, dCreateDate, dReleaseDate, and dOutDate


  • The dInDate field shows the date and time on the check-in form. (The Release Date field on the check-in form uses this value.)

  • The dCreateDate field is the date and time the document is actually checked into the system.

  • The dReleaseDate field shows the date and time the document makes it to the Released state (after being converted, indexed, out of workflow, etc.). This gets modified during a metadata update.

  • The dOutDate field shows the date the document will expire. (The Expiration Date field on the check-in form uses this value.)
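These fields can also be used directly in search queries. For example, a GET_SEARCH_RESULTS QueryText restricted by check-in and expiration dates, mirroring the syntax shown in the RIDC search post above, might look like this (the date values are hypothetical):

```
dSecurityGroup `Public` dInDate >= `1/1/15 12:00 AM` dOutDate <= `12/31/15 12:00 AM`
```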

NOTE: The counters logic was rewritten in 11g, and the RevClasses table was part of the solution. Additional information about dDocCreatedDate can be found in the documentation (Section 5.1.13, Configuring the dDocCreatedDate Attribute).

Sunday, July 12, 2015

Generating a Java thread dump

On Windows
https://access.redhat.com/solutions/19170

On Linux
https://access.redhat.com/solutions/18178

Wednesday, July 8, 2015

UCM: Standard Services

checkSecurity
• Takes zero or one parameters. If a parameter is given, it is the name of a ResultSet. The method checks whether the logged-in user has the appropriate security, as specified in the access level of the service, to perform the specified action. It checks the security against a specific security group and account (if accounts are enabled) as specified by a revision. This method is used to validate security for actions on a particular content item, e.g. check-in, check-out, or delete.

createResultSetSQL
• Takes no parameters. Given a dataSource and a whereClause (as set in local data), the method looks up the data source in the DataSources table and executes the query with the additional where clause. The method appends additional security clauses to any query referencing the Workflow and Revisions tables. The environment value MaxQueryRows determines the cutoff point for the number of rows returned. The results of the query are placed in the data under the name specified by resultName (as set in local data).

doSubService
• Takes one parameter: the name of a sub-service to execute.

loadDefaultInfo
• Takes no parameters. The method will first execute the loadDefaultInfo filter. It then loads environment information, types, formats and accounts. Used for creating check-in and update pages.

loadMetaOptionsLists
• Takes no parameters. The method first executes the filter loadMetaOptionsLists. It then proceeds to load all option lists referred to in the DocMetaDefinition table.

loadSharedTable
• Takes two parameters. The first parameter is the name of the table to look up in the server's cached tables. The second is the name the table will be given when added to the data. Use this method instead of executing a query when the data is already cached in the server. This method is primarily used to make a server-cached table available for a template.

loadSharedTableRow
• Takes two parameters. The first parameter is the name of the table to look up in the server's cached tables. The second parameter is an argument specifying a column in the database and a lookup key into the request data. The value for the key in the request data is used to find the row in the cached table. The values of the row are mapped to the local data using the names of the columns as keys. One usage for this function is to retrieve cached information about a specific user.

mapResultSet
• Takes at least three parameters. The first parameter is the name of a select query; the parameters that follow must appear in pairs. The first member of the pair is the column name; the second member is the key used to put the row value into local data. The method executes the specified query and maps the specified columns of the first row of the ResultSet to the local data. Use this method in place of a Type 5 action if the service only requires part of the first row of a ResultSet to be stored.

refreshCache
• Given a list of subjects, this method will do a refresh on each specified subject.

renameValues
• Takes multiple parameters that must appear in pairs. The pairs are made up of two keys. The first key is used to look up a value in the data. The second key is used to store the found value in the local data. If the value is not found, an exception is thrown and the service will abort with an error message.

setConditionVars
• Given a list of condition variables, this method will set them all to true. These values can only be tested in HTML template pages. They are not put into local data.

setLocalValues
• Takes multiple parameters that must appear in name/value pairs. The name/value pairs are placed into the local data.

Sunday, July 5, 2015

Java: Links and Resources

http://www.kodejava.org/

http://esus.com/

http://mindprod.com/jgloss/jgloss.html

http://rosettacode.org/wiki/Category:Java



http://www.leveluplunch.com/java/tutorials/
http://www.java-examples.com/
http://www.java2novice.com/java-collections-and-util/hashmap/
http://kodehelp.com/category/javaj2ee/java7api/java-io/

Java RunTime Environment was not found. Oracle Universal Installer cannot be run



I ran into trouble while uninstalling Oracle_ECM1 from my Windows system. In such a case, run the installer from the command prompt and pass the -jreLoc parameter as follows:

C:\Users\sonal_chaudhary>cd C:\Oracle\Oracle_ECM1\oui\bin\
C:\Oracle\Oracle_ECM1\oui\bin>setup.exe -deinstall -ignoreSysPrereqs -jreLoc C:\Java\jre7


Thursday, June 25, 2015

Java: Input and Output streams

The java.io package provides I/O classes to manipulate streams. This package supports two types of streams:
1. Binary streams, which handle binary data. InputStream and OutputStream are the high-level interfaces for manipulating binary streams.
2. Character streams, which handle character data. Reader and Writer are the high-level interfaces for manipulating character streams. In this section, the main focus is on binary streams.

By default, most streams read or write one byte at a time. This causes poor I/O performance, because reading and writing byte by byte takes a long time when dealing with large amounts of data. The java.io package provides buffered streams to override this byte-by-byte default behavior.



import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class Main {

    private static final String SOURCE_FILE = "D:\\test.jar";

    public static void main(String[] args) {
        Main io = new Main();
        try {
            long startTime = System.currentTimeMillis();
            io.readWrite(SOURCE_FILE, "D:\\test1.jar");
            long endTime = System.currentTimeMillis();
            System.out.println("Time taken for reading and writing using default behaviour : " + (endTime - startTime) +
                               " milliseconds");

            long startTime1 = System.currentTimeMillis();
            io.readWriteBuffer(SOURCE_FILE, "D:\\test2.jar");
            long endTime1 = System.currentTimeMillis();
            System.out.println("Time taken for reading and writing using buffered streams : " +
                               (endTime1 - startTime1) + " milliseconds");

            long startTime2 = System.currentTimeMillis();
            io.readWriteArray(SOURCE_FILE, "D:\\test3.jar");
            long endTime2 = System.currentTimeMillis();
            System.out.println("Time taken for reading and writing using custom buffering : " +
                               (endTime2 - startTime2) + " milliseconds");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void readWrite(String fileFrom, String fileTo) throws IOException {
        InputStream in = null;
        OutputStream out = null;
        try {
            in = new FileInputStream(fileFrom);
            out = new FileOutputStream(fileTo);
            while (true) {
                int bytedata = in.read();
                if (bytedata == -1)
                    break;
                out.write(bytedata);
            }
        } finally {
            if (in != null)
                in.close();
            if (out != null)
                out.close();
        }
    }

    public static void readWriteBuffer(String fileFrom, String fileTo) throws IOException {
        InputStream inBuffer = null;
        OutputStream outBuffer = null;
        try {
            InputStream in = new FileInputStream(fileFrom);
            inBuffer = new BufferedInputStream(in);
            OutputStream out = new FileOutputStream(fileTo);
            outBuffer = new BufferedOutputStream(out);
            while (true) {
                int bytedata = inBuffer.read();
                if (bytedata == -1)
                    break;
                // Write through the buffered stream; writing to the raw
                // FileOutputStream here would bypass output buffering.
                outBuffer.write(bytedata);
            }
        } finally {
            if (inBuffer != null)
                inBuffer.close();
            if (outBuffer != null)
                outBuffer.close();
        }
    }

    public static void readWriteArray(String fileFrom, String fileTo) throws IOException {
        InputStream in = null;
        OutputStream out = null;
        try {
            in = new FileInputStream(fileFrom);
            out = new FileOutputStream(fileTo);
            // NOTE: available() returns an estimate of the bytes that can be
            // read without blocking; it is not guaranteed to equal the file
            // length. A robust copy should loop until read() returns -1.
            int availableLength = in.available();
            byte[] totalBytes = new byte[availableLength];
            int bytesRead = in.read(totalBytes);
            out.write(totalBytes, 0, bytesRead);

        } finally {
            if (in != null)
                in.close();
            if (out != null)
                out.close();
        }
    }
}


OUTPUT
Time taken for reading and writing using default behaviour : 5188 milliseconds
Time taken for reading and writing using buffered streams : 3105 milliseconds
Time taken for reading and writing using custom buffering : 7 milliseconds
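For comparison, on Java 7 and later the same copy can be done with java.nio.file.Files, which buffers internally. A minimal sketch, using a temporary file in place of the D:\test.jar paths above:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class NioCopy {
    public static void main(String[] args) throws IOException {
        // Create a small sample source file (stands in for D:\test.jar).
        Path source = Files.createTempFile("nio-copy", ".bin");
        Files.write(source, new byte[] {1, 2, 3, 4, 5});

        // Files.copy streams the bytes with efficient buffering internally.
        Path target = Paths.get(source.toString() + ".copy");
        Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);

        System.out.println("Copied " + Files.size(target) + " bytes"); // prints: Copied 5 bytes
    }
}
```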

Java: Properties Class

A Properties object is a persistent Hashtable that stores key–value pairs of Strings. By "persistent", we mean that the Properties object can be written to an output stream (possibly a file) and read back in through an input stream. A common use of Properties objects in prior versions of Java was to maintain application-configuration data or user preferences for applications.

import java.io.FileOutputStream;
import java.io.FileInputStream;
import java.io.IOException;

import java.util.Properties;
import java.util.Set;

public class PropertiesTest {
    public static void main(String[] args) {
        Properties table = new Properties();

        // set properties
        table.setProperty("color", "blue");
        table.setProperty("width", "200");

        System.out.println("After setting properties");
        listProperties(table);

        // replace property value
        table.setProperty("color", "red");

        System.out.println("After replacing properties");
        listProperties(table);

        saveProperties(table);

        table.clear(); // empty table

        System.out.println("After clearing properties");
        listProperties(table);

        loadProperties(table);

        // get value of property color
        Object value = table.getProperty("color");

        // check if value is in table
        if (value != null)
            System.out.printf("Property color's value is %s%n", value);
        else
            System.out.println("Property color is not in table");
    }

    // save properties to a file

    private static void saveProperties(Properties props) {
        // save contents of table
        try {
            FileOutputStream output = new FileOutputStream("props.dat");
            props.store(output, "Sample Properties"); // save properties
            output.close();
            System.out.println("After saving properties");
            listProperties(props);
        } catch (IOException ioException) {
            ioException.printStackTrace();
        }
    }

    // load properties from a file

    private static void loadProperties(Properties props) {
        // load contents of table
        try {
            FileInputStream input = new FileInputStream("props.dat");
            props.load(input); // load properties
            input.close();
            System.out.println("After loading properties");
            listProperties(props);
        } catch (IOException ioException) {
            ioException.printStackTrace();
        }
    }

    // output property values

    private static void listProperties(Properties props) {
        Set<Object> keys = props.keySet(); // get property names

        // output name/value pairs
        for (Object key : keys)
            System.out.printf("%s\t%s%n", key, props.getProperty((String)key));

        System.out.println();
    }
}


OUTPUT
After setting properties
color blue
width 200

After replacing properties
color red
width 200

After saving properties
color red
width 200

After clearing properties

After loading properties
color red
width 200

Property color's value is red


Java: Apache Ant

build.properties
src.dir=src
classes.dir=classes
main-class=com.mypkg.PortfolioManager
lib.dir=lib
docs.dir=docs
projectName=AntTutorial

build.xml
<?xml version="1.0" encoding="windows-1252" ?>
<!--Ant buildfile generated by Oracle JDeveloper-->
<!--Generated Apr 20, 2015 4:09:46 PM-->
<project xmlns="antlib:org.apache.tools.ant" name="Project" default="all" basedir=".">
  <property file="build.properties"/>

  <target name="clean">
    <delete dir="${classes.dir}"/>
    <delete dir="${docs.dir}"/>
  </target>

  <target name="init">
    <mkdir dir="${classes.dir}"/>
    <!--<mkdir dir="${docs.dir}"/>-->
  </target>

  <path id="classpath">
    <fileset dir="${lib.dir}" includes="**/*.jar"/>
  </path>

  <target name="compile" depends="init">
    <javac srcdir="${src.dir}" destdir="${classes.dir}" classpathref="classpath"/>
  </target>

  <!--<target name="docs" depends="compile">
    <javadoc packagenames="src" sourcepath="${src.dir}" destdir="${docs.dir}">
       <fileset dir="${src.dir}">
                <include name="**" />
           </fileset>
    </javadoc>
  </target>-->

  <target name="jar" depends="compile">
    <jar destfile="${projectName}.jar" basedir="${classes.dir}">
      <manifest>
        <attribute name="Main-Class" value="${main-class}"/>
      </manifest>
    </jar>
  </target>

  <target name="main" depends="clean,compile,jar"/>
</project>

Download the sample project from the file cabinet: AntTut.zip
More Info: http://www.tutorialspoint.com/ant/index.htm

Java: Dynamic Polymorphism

Polymorphism in Java has two types:
1. Compile time polymorphism (static binding)
2. Runtime polymorphism (dynamic binding).

Method overloading is an example of static polymorphism, while method overriding is an example of dynamic polymorphism.

public class Vehicle {
    public void move() {
        System.out.println("Vehicles can move!");
    }
}

class MotorBike extends Vehicle {
    public void move() {
        System.out.println("MotorBike can move and accelerate too!");
    }
}

class Test{
    public static void main(String[] args) {
        Vehicle vh = new MotorBike();
        vh.move(); // prints MotorBike can move and accelerate too!
        vh = new Vehicle();
        vh.move(); // prints Vehicles can move!
    }
}

Note that in the first call to move(), the reference type is Vehicle while the object being referenced is a MotorBike. When move() is called, Java waits until runtime to determine which object the reference actually points to; here it is a MotorBike, so MotorBike's move() method is called. In the second call to move(), the object is a Vehicle, so Vehicle's move() method is called.

More Info: http://www.javatpoint.com/static-binding-and-dynamic-binding
Info on typecasting: http://www.c4learn.com/java/java-type-casting-inheritance/
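To connect this with the typecasting link above, here is a minimal self-contained sketch showing an explicit downcast guarded by instanceof (the accelerate() method is hypothetical, added only to illustrate a subclass-only member):

```java
public class CastDemo {
    static class Vehicle {
        public void move() {
            System.out.println("Vehicles can move!");
        }
    }

    static class MotorBike extends Vehicle {
        public void move() {
            System.out.println("MotorBike can move and accelerate too!");
        }

        // Subclass-only method: not reachable through a Vehicle reference
        public void accelerate() {
            System.out.println("Accelerating!");
        }
    }

    public static void main(String[] args) {
        Vehicle vh = new MotorBike();
        vh.move();              // dynamic dispatch picks MotorBike.move()
        // vh.accelerate();     // compile error: accelerate() is not declared in Vehicle
        if (vh instanceof MotorBike) {
            MotorBike mb = (MotorBike) vh; // safe downcast after the instanceof check
            mb.accelerate();
        }
    }
}
```

The instanceof guard avoids a ClassCastException if the reference later points at a plain Vehicle.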

Java: Interfaces


public interface Language {
    String getBirthday();

    String getGreeting();
}

public class Indonesian implements Language {
    public String getBirthday() {
        return "Selamat Ulang Tahun";
    }

    public String getGreeting() {
        return "Apa kabar?";
    }
}

public class English implements Language {
    public String getBirthday() {
        return "Happy Birthday";
    }

    public String getGreeting() {
        return "How are you?";
    }
}

public class LanguageDemo {
    public static void main(String[] args) {
        Language language = new English();
        System.out.println(language.getBirthday());
        System.out.println(language.getGreeting());

        language = new Indonesian();
        System.out.println(language.getBirthday());
        System.out.println(language.getGreeting());
    }
}



OUTPUT:
Happy Birthday
How are you?
Selamat Ulang Tahun
Apa kabar?


Java: Decompiling using Procyon

import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

import java.nio.file.FileSystems;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

import java.util.ArrayList;
import java.util.List;

public class ProcyonDecompiler {

    private static String JAR_DIRECTORY = "C:\\ucm";
    private static String DECOMPILER_FILEPATH = "lib/procyon-decompiler-0.5.29.jar";
    private static List<Path> fileList = new ArrayList<Path>();

    public static void main(String[] args) throws IOException, InterruptedException {
        walkDirectory();
        for (int i = 0; i < fileList.size(); i++) {
            Path absolutePath = fileList.get(i).toAbsolutePath();
            Path parentPath = fileList.get(i).getParent();
            Path fileName = fileList.get(i).getFileName();
            executeJarFile(absolutePath.toString(), parentPath.toString(), trimFileExtension(fileName.toString()));
        }
    }

    private static void executeJarFile(String absolutePath, String parentPath, String filename) throws IOException,
                                                                                                       InterruptedException {
        ProcessBuilder pb =
            new ProcessBuilder("java", "-jar", DECOMPILER_FILEPATH, "-jar", absolutePath, "-o",
                               parentPath + File.separator + filename);
        Process p = pb.start();
        BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String s = "";
        while ((s = in.readLine()) != null) {
            System.out.println(s);
        }
        int status = p.waitFor();
        System.out.println("Exited with status: " + status);
    }

    private static String trimFileExtension(String filename) {
        int pos = filename.lastIndexOf(".");
        if (pos > 0) {
            filename = filename.substring(0, pos);
        }
        return filename;
    }

    public static void walkDirectory() throws IOException {
        Path start = FileSystems.getDefault().getPath(JAR_DIRECTORY);
        Files.walkFileTree(start, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
                if (file.toString().endsWith(".jar")) {
                    //System.out.println(file);
                    fileList.add(file);
                }
                return FileVisitResult.CONTINUE;
            }
        });
    }
}

Java: Extract a JAR file

import java.io.IOException;

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class UnzipJar {

    private static final String JAR_PATH = "D:\\test\\Java_gpl_v1.04.jar";

    public static void unzipJarFile(Path jar) throws IOException {
        if (!Files.exists(jar))
            return;

        String fnJar = jar.getFileName().toString();
        String fn = fnJar.substring(0, fnJar.lastIndexOf(".jar"));
        System.out.println(fnJar + " " + fn);
        Path dst = jar.getParent().resolve(fn);
        Files.createDirectory(dst);
        JarFile jf = new JarFile(jar.toString());

        //create directory
        for (Enumeration<JarEntry> enums = jf.entries(); enums.hasMoreElements();) {
            JarEntry entry = enums.nextElement();
            if (entry.isDirectory()) {
                Files.createDirectories(dst.resolve(entry.getName()));
            }
        }
        //copy file
        for (Enumeration<JarEntry> enums = jf.entries(); enums.hasMoreElements();) {
            JarEntry entry = enums.nextElement();
            if (!entry.isDirectory()) {
                Files.copy(jf.getInputStream(entry), dst.resolve(entry.getName()), StandardCopyOption.REPLACE_EXISTING);
            }
        }
        jf.close(); // release the open JAR file handle
    }

    public static void main(String[] args) throws IOException {
        unzipJarFile(Paths.get(JAR_PATH));
    }
}


Monday, June 22, 2015

SelectivelyRefineAndIndex Component

Component Information:
The ability to control the conversion and indexing of content is accessible through two resource includes defined in the component.

The first resource include is called "pre_submit_to_conversion". Before a content item is sent to the Inbound Refinery, this resource include is executed. Within this include, the administrator may manipulate the "dConversion" variable depending on the metadata of the document. Setting this variable to "PASSTHRU" will cause that content item to skip conversion, while setting it to "MultipageTiff" will cause that document to use the "MultipageTiff" conversion. Now, instead of merely relying upon file type to determine conversion settings, the administrator may use the value of any metadata fields.

Download from the link below:
http://www.oracle.com/technetwork/middleware/content-management/index-092832.html


There is a note on My Oracle Support describing a scenario where items are checked in using Batchloader or IdcCommand and the files are sent to the IBR for conversion, but the user wants specific files, which would normally be sent to the IBR, to be set for passthru and not converted.

In the batchloader text file or hda file add the following parameters:

webViewableFile=<source file>.<source file extension>
webViewableFile:path=<source file path>/<source file>.<source file extension>
dWebExtension=<source file extension>

Example:
IdcService=CHECKIN_NEW
primaryFile=/tmp/AutoArchive.doc
dDocType=TEST
dDocTitle=TEST PASSTHRU 15
dSecurityGroup=Public
dDocAccount=Account1
dDocAuthor=pjolson
webViewableFile=AutoArchive.doc
webViewableFilePath=/tmp/AutoArchive.doc
dWebExtension=doc
xStorageRule=JDBC_Storage_Webless
<<EOF>>

Note that this example is using a webless FSP rule. Even when the web-viewable file is designated, it still won't be added to the weblayout directory or the FileStorage table.

SortSpec when Configured for OracleTextSearch

When the Content Server is set to SearchIndexerEngineName=OracleTextSearch, only fields in SDATA sections can be used for sorting. When using SortSpec to sort the search results, the correct syntax is:
&SortSpec=<Field> <ASC or DESC>, <Field> <ASC or DESC>

Fields that are optimized as SDATA sections will require the prefix 'sd'. Two standard fields that are already optimized and will require the prefix are dDocName and dDocTitle.
&SortSpec=sddDocTitle ASC,sddDocName ASC,dInDate DESC

and this is how we will be passing the SortSpec key in the binder:
requestBinder.putLocal("SortSpec","sddDocName ASC, dInDate DESC");


To get the full list of SDATA sections
1. Go to UCM Administration --> Configuration. Note the Active Index value. It will be either ots1 or ots2.
2. On the Content Server database schema run the following spool script on the active index:
set long 2000000 
set pages 0 
set heading off 
set feedback off 
spool /tmp/outputfile.txt 
select ctx_report.create_index_script('<Active Index>') from dual; 
spool off

NOTE: The <Active Index> is the index the script will be run against. If the active index is ots1 the ctx_report.create_index_script will be run using FT_IDCTEXT1. If the active index is ots2 the report will be run using FT_IDCTEXT2.
Example: select ctx_report.create_index_script('FT_IDCTEXT1') from dual; 

After the spool completes edit the outputfile.txt file. Look for these entries: ctx_ddl.add_sdata_section. These are the fields that have been configured to be SDATA sections. These will include any fields that were set to be optimized in the Text Search Admin page. Also look for these entries: ctx_ddl.add_sdata_column. These also are SDATA sections. If the number of ctx_ddl.add_sdata_section and ctx_ddl.add_sdata_column entries equals 32 then no new fields can be set to be optimized.


To workaround this limit
Since the 32 limit cannot be increased, one way to reduce the number of SDATA section fields is to disable indexing on the custom metadata fields that won't be required to be searched on. In the ctx_ddl.add_sdata_section and ctx_ddl.add_sdata_column entries, these are the fields that will have an x or sdx prefix.

1. Go into Configuration Manager --> Information Fields
2. Select the field that isn't required for searches
3. Click the Edit button
4. Uncheck the Enable for Search Index box
5. Repeat the previous steps for the other unneeded fields
6. Rebuild the search index
7. Run the select ctx_report.create_index_script again and confirm that the field or fields are no longer marked as an SDATA section

NOTE: There is a fixed limit of 32 SDATA sections that can be present at one time. This limitation of Oracle Text has been solved as of database version 12.1.0.1.0.  This fix has also been backported to version 11.2.0.3.0.  It's also included in the 11.2.0.4.0 patchset for the database. See Note 1562142.1.
After upgrading or patching the database to raise this limit, there is another change needed for WCC to recognize the new limit. See Note 1607548.1.

Friday, June 19, 2015

UCM: Running indexer from the command line

First make sure that the standalone applets are working. If not, check Note 1265076.1. Essentially, first reset the password of the sysadmin user:
UPDATE USERS SET DPASSWORD='welcome1' WHERE DNAME='sysadmin';
UPDATE USERS SET DPASSWORDENCODING='' WHERE DNAME='sysadmin';

and then set the JDBC connection in the System Properties application.

Create indexer.hda. This is the data that needs to be present in the file:
<?hda version="5.1.1 (build011203)" jcharset=Cp1252 encoding=iso-8859-1?>
#Full Collection Rebuild
@Properties LocalData
IdcService=CONTROL_SEARCH_INDEX
cycleID=rebuild
action=start
getStatus=1
fastRebuild=0
GetCurrentIndexingStatus=1
PerformProcessConversion=1
@end
<<EOF>>

Run the following command
IdcCommand -f C:\Work\indexer.hda -u sysadmin -l C:\Work\indexer.log

NOTE: To perform a "fast" rebuild instead of a "full" collection rebuild, set fastRebuild=1 in the HDA file

NOTE: If you receive "Error: Executing 'CONTROL_SEARCH_INDEX' command", add the following entry to the intradoc.cfg file:
IdcCommandServerHost=10.141.107.1 or IdcCommandServerHost=localhost
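The same CONTROL_SEARCH_INDEX call can also be issued from Java over RIDC instead of IdcCommand. A sketch, assuming the idc port and sysadmin user from the examples above (run it against a live Content Server):

```java
import oracle.stellent.ridc.IdcClient;
import oracle.stellent.ridc.IdcClientException;
import oracle.stellent.ridc.IdcClientManager;
import oracle.stellent.ridc.IdcContext;
import oracle.stellent.ridc.model.DataBinder;
import oracle.stellent.ridc.protocol.ServiceResponse;

public class RebuildIndexRIDC {
    public static void main(String[] args) {
        IdcClientManager manager = new IdcClientManager();
        try {
            IdcClient idcClient = manager.createClient("idc://localhost:4444");
            IdcContext userContext = new IdcContext("sysadmin");

            // Same parameters as the indexer.hda file above
            DataBinder binder = idcClient.createBinder();
            binder.putLocal("IdcService", "CONTROL_SEARCH_INDEX");
            binder.putLocal("cycleID", "rebuild");
            binder.putLocal("action", "start");
            binder.putLocal("getStatus", "1");
            binder.putLocal("fastRebuild", "0"); // set to 1 for a fast rebuild

            ServiceResponse response = idcClient.sendRequest(userContext, binder);
            DataBinder result = response.getResponseAsBinder();
            // StatusMessage, when present, describes the started indexing cycle
            System.out.println(result.getLocal("StatusMessage"));
        } catch (IdcClientException ice) {
            ice.printStackTrace();
        }
    }
}
```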

UCM: ClassNames

Service
• This class can be specialized to override existing methods or add new methods. The Service class includes methods for clearing local data, mapping result sets, renaming values, loading and validating values, caching shared tables, checking security, forcing login, getting and filling user data, loading metadata option lists, and refreshing cached data.

DocService
• This class can be specialized to override existing methods or add new methods. DocService includes methods for executing services of DocServiceHandler and computing document URLs.

MetaService
• This class can be specialized to override existing methods or add new methods. MetaService includes methods for updating metadata definitions, updating option lists, updating templates, and getting option lists.

DocHandlerFactory
• DocHandlerFactory can be implemented to create a customized version of DocServiceHandler. The class that implements DocHandlerFactory must be registered with DocService using the static method setHandlerFactory(). DocServiceHandler includes methods for checking in content, handling subscriptions, handling the format wizard, loading default information fields, validating information fields, and updating and deleting content items.

PageHandlerService
• This class can be specialized to override existing methods or add new methods. PageHandlerService includes methods for performing report queries and outputting historical reports.

WorkflowDocImplementor
• This class can be specialized to augment the behavior of workflow handling. WorkflowDocImplementor includes methods for handling general and criteria workflow.

RevisionImplementor
• This class can be specialized to augment the behavior of major and minor revision control.

SecurityImplementor
• SecurityImplementor can be implemented to override current security handling. You can extend ServiceSecurityImplementor to augment the current behavior.

Migrate contents present in framework folders to another instance

Replicating the folder structure for Framework Folders from one server to another is accomplished by using Archiver to replicate the necessary tables.

In Archiver, if you just want to move the folder structure, you would export/import these tables:
FolderFolders
FolderMetaDefaults

If you want to move content also, you will also need FolderFiles table.

When configuring the Archive to export, include the Framework Folders tables:

  + navigate to the "Export Data" tab
  + click on the "Table" tab
  + click "Add" and include the following tables: FolderFolders, FolderFiles, FolderMetaDefaults

This should allow new folders & folder changes to be migrated/replicated.

NOTE: By default all files/content will be archived as well. If you only want to export the folder structure, add a dummy condition to the Export Query under the "Content" tab (e.g., where the Content ID is -1).

See the documentation here for steps on how to add a table to an archive: http://docs.oracle.com/cd/E23943_01/doc.1111/e10792/c08_migration.htm#CHDJAABJ


Since migrating folders implies exporting tables, you can specify an export query, where you can tell which folder you want to export. 
You can for example specify in the export query the fFolderGuid of 'Folder2' so that only that folder gets exported. If you export the files you would have to export the FolderFiles tables, and specify the fParentGuid with the ID of 'Folder2' (fFolderGuid). 

If you want to export all sub folders, you need to create a custom Query Expression, that recursively gets all sub folders. 

For example, to export all subfolders from FolderFolders, starting with the folder with the ID 6CCC155115BC5BF7D9EEA691B1EBB41F, use: 

fFolderGUID IN (select ff.fFolderGUID 
from folderfolders ff 
connect by prior ff.ffolderguid = ff.fparentguid 
start with ff.ffolderguid = '6CCC155115BC5BF7D9EEA691B1EBB41F') 

You can re-use the same custom Query expression for FolderMetaDefaults 

to export all files in a folder hierarchy, you need to use the query below instead: 

fParentGUID IN (select ff.fFolderGUID 
from folderfolders ff 
connect by prior ff.ffolderguid = ff.fparentguid 
start with ff.ffolderguid = '6CCC155115BC5BF7D9EEA691B1EBB41F') 

Text Extraction Process manually: textexport

The text extraction process is performed using the "textexport" program that comes bundled in the ContentAccess component. This component contains Oracle OutsideIn functionality. The textexport program reads the PDF file and places all of the extracted text into the active collection folder (ots1 or ots2) under
<ucm-install>/search/ots1/bulkload/~export

To preserve this file, open the Repository Manager applet, and on the Indexing tab, click the Configuration button. On the popup that displays, the debug level can be set to trace.

If you want to see the text extraction process, you need to run the textexport manually. Create an HDA testfile.hda file where the input file parameter will need to be set to a valid path:
<?hda version="10.1.3.5.1 (111229)" jcharset=UTF8 encoding=utf-8?>
@Properties LocalData
OutputCharacterSet=utf8
blFieldTypes=
FallbackFormat=fi_unicode
InputFilePath=C:\Users\sonal\Downloads\pdf.pdf
blDateFormat=M/d/yy {h:mm[:ss] {aa}[zzz]}!mAM,PM!tAmerica/Chicago
@end

Run the following command from the cmd:
C:\Oracle\Oracle_ECM1\oit\win32\lib\contentaccess\textexport.exe -c C:\testfile.hda -f C:\finaltextfile.txt

finaltextfile.txt will contain the extracted text from the pdf file mentioned in the HDA file.
fi_unicode: Display as text and assume the Unicode character set.

Thursday, June 18, 2015

UCM: Indexing

To configure UCM to place a .hcst file in the weblayout directory instead of a copy of the native file, set IndexVaultFile=true. This will work only when the file is a passthru file (didn't go through IBR). The .hcst file in the weblayout points to the vault file only.
IndexVaultFile=true
NOTE: IndexVaultFile=true was replaced with UseNativeFormatInIndex=true. Either of these configuration settings will force the indexer to index the native file.
NOTE: When using webless storage, use UseNativeFormatInIndex=true. IndexVaultFile=true should not be used at all.

If the above configuration setting is true and you still want to allow some document formats to be copied to the weblayout directory, set:
IndexVaultExclusionWildcardFormats=*/hcs*|*/ttp|*/xsl|*/wml|*template*|*/jsp*|*/gif|*/png|*/pdf|*/doc*|*/msword|*/*ms-excel|text/plain


When a large file is being indexed, and textexport times out, you can increase the timeout. The default value is 15 seconds.
TextExtractorTimeoutInSec=60
IndexerTextExtractionGuardTimeout=60


UCM will not index files larger than 10485760(10 MB) by default unless the configuration entry MaxIndexableFileSize is set (in this example 20 MB). Setting this to 0 (zero) stops full text indexing but still allows use of Oracle Text Search. This is useful if you still need case insensitive searches but do not need full text indexing.
MaxIndexableFileSize=20971520


This parameter lists what formats will be text indexed. If a file format extension is not on the list, the textexport will not get invoked and it will be indexed as metadata only.
TextIndexerFilterFormats=pdf,msword,ms-word,doc*,ms-excel,xls*,ms-powerpoint,powerpoint,ppt*,rtf,xml,msg,zip

More information in depth: Doc ID 445871.1

Wednesday, June 10, 2015

UCM: Deleting contents or folders inside a framework folders

Use FLD_DELETE service to delete files or folders.
binder.putLocal("IdcService", "FLD_DELETE")

// Deleting a folder. Use any of the 2 ways below
  binder.putLocal("item1", "fFolderGUID:69A93E7E99FA46CC35CBCEA0E1B9F8DB")
  // binder.putLocal("item1", "path:/Enterprise Libraries/My Library/Folder2")

// Deleting a file. Use any of the 2 ways below
  binder.putLocal("item2", "fFileGUID:8F9E18BB9D8609A0E07F391C8A3737F4")
  //binder.putLocal("item2", "path:/Enterprise Libraries/My Library/Folder4/file1.txt")
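Putting the binder lines above together, a minimal RIDC sketch for FLD_DELETE (the connection details, GUIDs, and paths are placeholders reused from the examples on this page):

```java
import oracle.stellent.ridc.IdcClient;
import oracle.stellent.ridc.IdcClientException;
import oracle.stellent.ridc.IdcClientManager;
import oracle.stellent.ridc.IdcContext;
import oracle.stellent.ridc.model.DataBinder;
import oracle.stellent.ridc.protocol.ServiceResponse;

public class DeleteFolderRIDC {
    public static void main(String[] args) {
        IdcClientManager manager = new IdcClientManager();
        try {
            IdcClient idcClient = manager.createClient("idc://localhost:4444");
            IdcContext userContext = new IdcContext("sysadmin");

            DataBinder binder = idcClient.createBinder();
            binder.putLocal("IdcService", "FLD_DELETE");
            // item1, item2, ... may freely mix folders (by GUID or path) and files
            binder.putLocal("item1", "fFolderGUID:69A93E7E99FA46CC35CBCEA0E1B9F8DB");
            binder.putLocal("item2", "path:/Enterprise Libraries/My Library/Folder4/file1.txt");

            ServiceResponse response = idcClient.sendRequest(userContext, binder);
            DataBinder result = response.getResponseAsBinder();
            System.out.println(result.getLocal("StatusMessage"));
        } catch (IdcClientException ice) {
            ice.printStackTrace();
        }
    }
}
```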


UCM: Checkin new content inside a framework folder using RIDC


public class CheckinFrameworkRIDC {

    public static void main(String[] args) {
        IdcClientManager manager = new IdcClientManager();
        try {
            // Creating a new IdcClient Connection using idc protocol
            //IdcClient idcClient = manager.createClient("idc://localhost:4444");
   //IdcContext userContext = new IdcContext("sysadmin");

            // Creating a new IdcClient Connection using HTTP protocol
            IdcClient idcClient = manager.createClient("http://localhost:16200/cs/idcplg");
     IdcContext userContext = new IdcContext("weblogic", "welcome1");
   
            HdaBinderSerializer serializer = new HdaBinderSerializer("UTF-8", idcClient.getDataFactory());
            DataBinder dataBinder = idcClient.createBinder();
            dataBinder.putLocal("IdcService", "CHECKIN_NEW");
            dataBinder.putLocal("dDocTitle", "Framework Folder Testing");
            dataBinder.putLocal("dDocType", "Document");
            dataBinder.putLocal("dSecurityGroup", "Public");
            dataBinder.addFile("primaryFile", new File("C:\\samplefile.txt"));
            dataBinder.putLocal("doFileCopy", "true");
            dataBinder.putLocal("dDocAuthor", "weblogic");
            
            // Either fParentGUID or parentFolderPath/fParentPath needs to be passed. Metadata defaults are copied from the folder except SecurityGroup and Account
            //dataBinder.putLocal("fParentGUID", "5B0AC7C33BF951078772DFF757535B99");
            dataBinder.putLocal("fParentPath", "/Contribution Folders/ElPiju/Straw Bale/resources");
   
            serializer.serializeBinder(System.out, dataBinder);
            ServiceResponse response = idcClient.sendRequest(userContext, dataBinder);
            DataBinder responseData = response.getResponseAsBinder();
            serializer.serializeBinder(System.out, responseData);
        } catch (IdcClientException ice) {
            ice.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}


UCM: Framework Folders

When using the FrameworkFolders component, each folder is identified by its fFolderGUID and fParentGUID values. The fFolderGUID is the unique identifier for the folder, and the fParentGUID is set to the value of fFolderGUID for the folder's parent folder.

To list the two content items under the Cob folder:
SELECT ffileguid, ffilename, ddocname FROM folderfiles WHERE fparentguid='94B2FCB3A3D15A27E96B927CA80A3BD7';

To list the three folders present under the Cob folder:
SELECT ffolderguid, ffoldername FROM folderfolders WHERE fparentguid='94B2FCB3A3D15A27E96B927CA80A3BD7';

NOTE: A content item is not associated with fFolderGUID. It has only fParentGUID.

Thursday, May 21, 2015

UCM: contains operator in DATABASE.METADATA SearchEngine

While working on a content server where the SearchEngine is set to DATABASE.METADATA, we might still need the <contains> operator for searching content, just as in OracleTextSearch. This comes in handy when, say, there is a multivalued metadata field (xDepartment) with multiple values (AB, BC, CD). In such a case we would normally think of enabling the OracleTextSearch engine, but OracleTextSearch indexing is slower, so a content item may take longer to reach the Released status; besides, the system may not need full-text search at all.

Oracle provides a component called DBSearchContainsOpSupport. Enable it and restart. Then follow the "Zone Fields Configuration" link on the Administration page and define the fields that should be full-text indexed. That's it.

You can then check your results by modifying the querytext in the Advanced Query builder form
`BC` <contains> xDepartment
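Once DBSearchContainsOpSupport is enabled, the same QueryText can also be passed programmatically. A sketch over RIDC (the server URL is a placeholder, and xDepartment is the example field from above):

```java
import oracle.stellent.ridc.IdcClient;
import oracle.stellent.ridc.IdcClientException;
import oracle.stellent.ridc.IdcClientManager;
import oracle.stellent.ridc.IdcContext;
import oracle.stellent.ridc.model.DataBinder;
import oracle.stellent.ridc.model.DataObject;
import oracle.stellent.ridc.model.DataResultSet;
import oracle.stellent.ridc.protocol.ServiceResponse;

public class ContainsSearchRIDC {
    public static void main(String[] args) {
        IdcClientManager manager = new IdcClientManager();
        try {
            IdcClient idcClient = manager.createClient("idc://localhost:4444");
            IdcContext userContext = new IdcContext("sysadmin");

            DataBinder binder = idcClient.createBinder();
            binder.putLocal("IdcService", "GET_SEARCH_RESULTS");
            binder.putLocal("QueryText", "`BC` <contains> xDepartment");
            binder.putLocal("ResultCount", "20");

            ServiceResponse response = idcClient.sendRequest(userContext, binder);
            DataBinder result = response.getResponseAsBinder();
            // GET_SEARCH_RESULTS returns its hits in the SearchResults result set
            DataResultSet results = result.getResultSet("SearchResults");
            for (DataObject row : results.getRows()) {
                System.out.println(row.get("dDocName") + " : " + row.get("dDocTitle"));
            }
        } catch (IdcClientException ice) {
            ice.printStackTrace();
        }
    }
}
```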

Wednesday, May 20, 2015

UCM: Component Creation

If you want to start building a UCM component from scratch, then download the ABCComponent.zip file from the Downloads. It contains most of the common resource files, common code for building services and filters, thereby saving your time.

Just open ComponentWizard applet, add a new component, provide a new name for the component, check the copy existing option, select the ABCComponent.hda file. After that, modify the component according to requirements.

Tuesday, May 19, 2015

UCM: Thumbnail generation

An interesting discussion was going on regarding the thumbnail generation in UCM. Refer these 2 links:
https://community.oracle.com/thread/3721623
https://community.oracle.com/thread/3723218

To summarize, you can use the basic set of thumbnail creation options provided with UCM (and no IBR). In order to access the thumbnail image use either of the services below.

  • GET_THUMBNAIL, pass the dDocName. This service is not documented in service reference guide but is present in the std_services.htm resource file.
  • GET_FILE, pass the dDocName as well as the Rendition, the value of which depends on the option selected in the Configure Thumbnail Options page (jpg, gif, or png): Rendition=rendition:T, rendition:P, or rendition:G. The dRendition1 column of the Revisions table is also populated with T/P/G.


NOTE: The native file has the dRenditionId value of "primaryFile". The Weblayout file has the dRenditionId value of "webViewableFile".  The attachment that you add gets stored with the dRenditionId value of "rendition:Z".
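For example, fetching the thumbnail rendition via the GET_FILE service described above might look like this over RIDC (a sketch; the server URL, dDocName, and output filename are placeholders):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import oracle.stellent.ridc.IdcClient;
import oracle.stellent.ridc.IdcClientException;
import oracle.stellent.ridc.IdcClientManager;
import oracle.stellent.ridc.IdcContext;
import oracle.stellent.ridc.model.DataBinder;
import oracle.stellent.ridc.protocol.ServiceResponse;

public class GetThumbnailRIDC {
    public static void main(String[] args) {
        IdcClientManager manager = new IdcClientManager();
        try {
            IdcClient idcClient = manager.createClient("idc://localhost:4444");
            IdcContext userContext = new IdcContext("sysadmin");

            DataBinder binder = idcClient.createBinder();
            binder.putLocal("IdcService", "GET_FILE");
            binder.putLocal("dDocName", "TEST000101"); // placeholder content ID
            binder.putLocal("RevisionSelectionMethod", "Latest");
            binder.putLocal("Rendition", "rendition:T"); // T, P, or G per thumbnail options

            ServiceResponse response = idcClient.sendRequest(userContext, binder);
            // The response stream carries the binary thumbnail data
            InputStream in = response.getResponseStream();
            OutputStream out = new FileOutputStream("thumbnail.jpg");
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            out.close();
            in.close();
        } catch (IdcClientException ice) {
            ice.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}
```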

Wednesday, May 6, 2015

UCM: Migration of content files from one instance to another

Refer the link below in order to understand the entire process:
http://www.ateam-oracle.com/migrating-folders-and-content-together-in-webcenter-content/

Points:
1. Just set AllowArchiveNoneFolderItem=false in general Configuration Screen
2. Create an outgoing provider in the source instance, providing details of the target UCM instance.

3. This entire process works only for the Folders_g component. If the FrameworkFolders component is enabled and you try to access the Folder Archive Configuration page, you get a !csJdbcGenericError error.


Setup Auto-Transfer
Auto-Replication should only be turned on once the initial Export, Transfer and Import steps have been completed.

On the Source archive
1. Click the Replication tab and in the Registered Exporters section click the Edit button. Check the box for Enable Automated Export. Click the Register button to register the Source instance as an exporter.
2. Select the Transfer To tab. Under Transfer Options, click the Edit button. Select the Is Transfer Automated option.

On the Target archive
1. Click the Replication tab in the Registered Importer section click the Register Self button.

Webcenter Sites: Asset, Asset Type, Asset definition

An asset represents a piece of content, such as an article or an image, that is managed in WebCenter Sites. It consists of a set of properties that helps distinguish it from other assets of the same type.

An asset type is a schema for distinguishing content types from one another. It is made up of a set of attributes, which can also be described as a set of fields on an asset creation form.

An asset type is a set of attribute names, while an asset is a set of attribute values.
An asset is an instance of an asset type.

When you create a new asset type for your CM site, a table in the database is also created, named for the asset type, that will store its instances (the assets). The columns in the database table correspond to the attribute names, while the rows correspond to its asset instances.

Asset definition controls the asset creation form. In particular, one can determine what attribute fields appear on the form and what order they appear in, as well as which attribute fields are mandatory versus optional.


Core asset types ship with WebCenter Sites by default. These are asset types that are typically used to support the structure of your CM site and website. Core asset types can be categorized as follows:
1. Container Assets: They "contain" lists of other assets (Collection, Recommendation).
2. Developer Assets: Query, Template, CSElement, Site Entry, Attribute Editor

Content asset types, such as articles, images, and media types, do not ship with WebCenter Sites.

All assets can be divided into two loosely defined categories: 
1. Content assets are items that visitors read and examine on your site.
2. Design assets are items that organize and format the content assets (Templates, CSElements, SiteEntry assets).

The assets available to a site depend on the content applications and sample sites that are installed. As a general rule, the applications deliver design assets, and the sample sites deliver content assets.

WebCenter Sites: Product Architecture


WebCenter Sites - the base application for web experience and content management
WebCenter Sites: Community - management of user-generated content, such as comments, ratings, and polling
WebCenter Sites: Gadgets - management of gadgets for use on websites such as iGoogle
WebCenter Sites: Engage - management of segments and strategic marketing tools
Content Connectors - integration with other source repository systems, such as WebCenter Content, Documentum, SharePoint, or file systems
WebCenter Sites: Analytics - reporting of website content usage
Remote Satellite Server - edge caching application for larger-scale deployments

Add attachments without using the Managed Attachments window

SCENARIO: When using the WebCenter Content Application Adapters for PeopleSoft or E-Business Suite, attachments are typically added by logging into PSFT or EBS and opening the Managed Attachments popup window. Attachments can also be added through direct checkins to the content server, as long as certain parameters are included in the checkin data.

To add attachments without using the Managed Attachments window, the following three parameters must be added to a checkin service call. This may be useful when migrating content from one system to another, or for initial batch loading. To understand how these parameters work, look at the AFObjects table in the WebCenter Content schema. This table stores these values on checkin.

  1. dAFApplication - This is the name of the EBS instance that is set as the application in the AXF_COMMAND_PARAMETERS table. If using the OAF pages in EBS, the application parameter is set in the OAF_AXF_CMD_PARAMS table.  If using PeopleSoft, this value is set in the PS_AXF_CMD_PARAMS table.  This parameter is usually the simplest one to determine, since many content items will be using the same value for this.
  2. dAFBusinessObjectType - This setting must match what is set up in PSFT or EBS to pass as the business object type. For EBS forms, this is determined using the AXF schema table AXF_FND_MAP. The AXF_FND_MAP.ENTITY_NAME column is used for EBS forms. For example, when using the Oracle form Invoices, the dAFBusinessObjectType is AP_INVOICES. If using the "Requisition" page on iProcurement, this value will be what is set in the OAF_AXF_CMD_PARAMS for that OAF page (e.g. Requisition, or REQ_HEADER, depending on how the page was initially set up in this table). When using PeopleSoft, the value will be set in the PS_AXF_CMD_PARAMS table. If performing checkins outside of the Managed Attachments window, this value must match exactly. If the match is not exact, content items checked in outside of the Managed Attachments window will not show up when the business user logs into EBS and opens the attachments popup.
  3. dAFBusinessObject - This is the key value for the page or form. An example of the key value on the AP_INVOICES form is the invoice number. This value is typically an integer representing an ID for an invoice, requisition, purchase order, work order, etc. This must match exactly what is used when the Managed Attachments window performs checkins. If the value used in direct checkins to the content server does not match, the attachments will not show up on the Managed Attachments page, and will be orphaned in the AFObjects table.


Understand that these fields are not WCC metadata fields, but data stored in the AFObjects table. Like metadata field values, however, they can be set in the local data of the checkin call to the content server. If using RIDC for a call to CHECKIN_NEW or CHECKIN_UNIVERSAL, these three settings can be added in the same manner as common metadata fields such as dSecurityGroup or xComments:
//Managed Attachments parameters - these 3 trip the AppAdapter filter so that the content item is attached via an insert into the AFObjects table.
dataBinder.putLocal("dAFApplication", "EBS_instanceA");
dataBinder.putLocal("dAFBusinessObjectType", "REQ_HEADERS"); 
dataBinder.putLocal("dAFBusinessObject", "181152");

If using a SOAP request for the checkin service call, the following example shows the fields that can be added when calling a checkin service using GenericSoapService (a.k.a GenericSoapPort).

<ns1:Field name="dAFApplication">EBS_instanceA</ns1:Field>
<ns1:Field name="dAFBusinessObjectType">AP_INVOICES</ns1:Field>
<ns1:Field name="dAFBusinessObject">1234</ns1:Field>

Again, to help understand how these values are stored on the WebCenter Content side, use SQL Developer or SQL*Plus to review the AFObjects table. To understand where these values come from on the application side, review each form or page and investigate the AXF schema to see how the form is configured to pass in the page name. For OAF pages, the table of most interest is OAF_AXF_CMD_PARAMS (and related OAF* tables). If using PSFT, the PS_AXF_CMD_PARAMS table contains the values. If using EBS forms, the AXF_FND_MAP and AXF_COMMAND_PARAMETERS tables contain the values.
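As a small sketch, the three AF values can be collected in one place before being copied into the binder with putLocal. The helper class below is illustrative (it is not part of RIDC), and the example values are the hypothetical ones used above.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class AfParams {
    // Hypothetical helper: builds the three AFObjects-related local-data
    // fields that a direct checkin must carry so the item appears as a
    // Managed Attachment. The field names are the real ones described above.
    static Map<String, String> afAttachmentParams(String application,
                                                  String businessObjectType,
                                                  String businessObject) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("dAFApplication", application);               // e.g. EBS_instanceA
        params.put("dAFBusinessObjectType", businessObjectType); // e.g. AP_INVOICES
        params.put("dAFBusinessObject", businessObject);         // e.g. the invoice ID
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> params = afAttachmentParams("EBS_instanceA", "AP_INVOICES", "1234");
        // In a real RIDC call, each entry would be added to the binder:
        //   params.forEach(dataBinder::putLocal);
        params.forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

Keeping the three values together makes it harder to check in an item with a mismatched dAFBusinessObjectType or dAFBusinessObject, which (as noted above) would orphan the item in the AFObjects table.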

Execute Enterprise Capture WLST Commands

Option 1:

1. Open a CMD prompt on your WLS server, and change to the "<ECM_HOME>\common\bin" directory (e.g. "C:\Oracle\Middleware\Oracle_ECM1\common\bin").
2. Run the wlst script ("wlst.cmd" on Windows or "wlst.sh" on Linux).
3. Connect to the server using the WLST command "connect(<userID>, <password>, <url>)". An example would be: connect('weblogic','welcome1','t3://localhost:7001')
4. Execute the domainRuntime() command.
5. At this point, any Capture WLST command can be executed. Execute help('capture') to display the list of Capture commands, then execute help on any individual command to see its syntax, e.g. help('listWorkspaces').

Option 2:

1. Open the EM console.
2. Navigate to the System MBean Browser (right-click on capture_server1 and select it from the menu).
3. In the left pane, expand the tree "Application Defined MBeans".
4. Expand the folder oracle.capture.
5. Drill down to Server: capture_server1 / Application: capture / config / Config.
6. In the right pane, select the operations tab.
7. Select the desired operation.  The operation page will be displayed.
8. On the Operation pages, supply (if needed) the parameters and click "Invoke".

For the list of the commands, check the link below:
http://docs.oracle.com/cd/E29542_01/doc.1111/e37898/advd_functions.htm

Batch Content to Later Show as EBS Managed Attachments for Certain Business Entities

SCENARIO: You have numerous content items that you need to batch/bulk load into Universal Content Management (UCM) without using the EBS interface, and those items need to be tied/related to certain EBS entities.

Below is an example of using UCM IdcCommand to batch/bulkload the content items:

1. Gain a better understanding of what the EBS Managed Attachments code sends to UCM for a "New" check-in by enabling the verbose system audit trace section "requestaudit" and adding an attachment to a specific business entity via the Managed Attachments EBS Zoom link. Example parameters sent to UCM are below:
requestaudit/6 10.26 17:09:44.854 IdcServer-7165:

CHECKIN_NEW [dID=8512][dDocName=UCM008514][dDocTitle=test][dUser=sysadmin][dSecurityGroup=AFDocuments][QueryText=dAFBusinessObjectType<matches>`PER_PEOPLE_F` <AND> dAFBusinessObject<matches>`6429` <AND> dAFApplication<matches>`EBS_instanceA`][xCollectionID=0][StatusCode=0][StatusMessage=Successfully checked in content item 'UCM008514'.] 0.16578499972820282(secs)

2. Using the values from the example UCM trace above, you can create a serialized hda file to use with IdcCommand. For your environment, you will need to change the values to meet your needs/trace results.

Example IdcCommand file name = <ucm install root>/bin/EBSCheckin.hda

<?hda version="10.1.3.3.2 (071031)" jcharset=UTF8 encoding=utf-8?>
doFileCopy=1
IdcService=CHECKIN_NEW
dSecurityGroup=AFDocuments
dDocType=EBSAttachment
dDocTitle=test
dDocAuthor=sysadmin
dDocAccount=
dRevLabel=1
dInDate=12/15/2010
dAFApplication=EBS_instanceA
dAFBusinessObjectType=PER_PEOPLE_F
dAFBusinessObject=6429
primaryFile=/home/oracle/Desktop/Hello.txt


3. Run IdcCommand to batch/bulk load the content item(s) from the <ucm install root>/bin directory:

IdcCommand -u sysadmin -f EBSCheckin.hda

4. IdcCommand should output "successful".

5. Log into that same EBS entity (e.g. a person's record).

6. When UCM prompts for credentials in the Managed Attachments Zoom for that EBS entity, log in as the user specified in the serialized hda as the value of "dDocAuthor".

7. Check that the batch/bulk loaded content item(s) are available via EBS.
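For larger batch loads, the hda file from step 2 can be generated programmatically rather than by hand. The sketch below builds the same serialized hda as a string and writes it to disk; the fixed field values (security group, doc type, author, date) are copied from the example above and would need to be changed to match your environment.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class HdaWriter {
    // Builds a serialized hda for an EBS Managed Attachments checkin.
    // The constant fields mirror the example file above and are not
    // universal defaults - adjust them for your instance.
    static String buildCheckinHda(String app, String objType, String objKey, String primaryFile) {
        StringBuilder hda = new StringBuilder();
        hda.append("<?hda version=\"10.1.3.3.2 (071031)\" jcharset=UTF8 encoding=utf-8?>\n");
        hda.append("doFileCopy=1\n");
        hda.append("IdcService=CHECKIN_NEW\n");
        hda.append("dSecurityGroup=AFDocuments\n");
        hda.append("dDocType=EBSAttachment\n");
        hda.append("dDocTitle=test\n");
        hda.append("dDocAuthor=sysadmin\n");
        hda.append("dDocAccount=\n");
        hda.append("dRevLabel=1\n");
        hda.append("dInDate=12/15/2010\n");
        hda.append("dAFApplication=").append(app).append('\n');
        hda.append("dAFBusinessObjectType=").append(objType).append('\n');
        hda.append("dAFBusinessObject=").append(objKey).append('\n');
        hda.append("primaryFile=").append(primaryFile).append('\n');
        return hda.toString();
    }

    public static void main(String[] args) throws IOException {
        String hda = buildCheckinHda("EBS_instanceA", "PER_PEOPLE_F", "6429",
                "/home/oracle/Desktop/Hello.txt");
        Files.write(Paths.get("EBSCheckin.hda"), hda.getBytes(StandardCharsets.UTF_8));
    }
}
```

Looping over a list of (business object type, key, file) tuples and writing one hda per item would then let you drive IdcCommand once per file, exactly as in step 3.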

Enable Managed Attachments for Oracle EBS Forms That Don't Have FND Attachments Configured at the Form Level

For EBS forms that have FND Attachments enabled at the "Form" level, one can configure Managed Attachments by populating the AXF_FND_MAP table (either by using the AXF_MANAGED_ATTACHMENT_DATA.sql script or manually); additional parameters can be added by populating the AXF_MA_PARAMETERS table.

For EBS forms and/or custom forms that have FND Attachments enabled at "Form Function" level only, you will need to use an alternate approach and manually add entries to the AXF_CONFIGS / AXF_COMMANDS / AXF_COMMAND_PARAMETERS tables.

You will need to collect the following values:

  • Entity (businessObjectType): the EBS entity/module (for example: AP_INVOICES)
  • Primary Key (businessObjectKey1) and Data Block Object (businessObjectValue1): the form's data block and attribute that hold the primary key value for the record (for example: INV_SUM_FOLDER.INVOICE_ID)
  • Formfunction name for the EBS form (for example: AP_APXINWKB)


DECLARE
my_entity VARCHAR2(255) := 'your_ebsentity'; --example: AP_INVOICES
my_formfunction VARCHAR2(255) := 'your_formfunction'; --example: AP_APXINWKB
my_datablockname VARCHAR2(255) := 'your_forms_datablock_name'; --example: INV_SUM_FOLDER
my_fieldname VARCHAR2(255) := 'your_datablock_fieldname'; --example: INVOICE_ID

v_formId AXF_CONFIGS.FORMID%TYPE;
v_eventId AXF_COMMANDS.EVENTID%TYPE;
v_solutionendpoint AXF_CONFIGS.SOLUTIONENDPOINT%TYPE;

BEGIN
select AXF_CONFIGS_SEQ.NEXTVAL into v_formId from dual;
select SOLUTIONENDPOINT into v_solutionendpoint from AXF_CONFIGS where formfunction='AXF_MANAGED_ATTACHMENTS';

Insert into AXF_CONFIGS (FORMID,FORMFUNCTION,SOLUTIONENDPOINT,ENTITYNAME,LOGENABLED,DATABLOCKNAME,CREATED_BY,CREATION_DATE,LAST_UPDATE_DATE,LAST_UPDATED_BY,LAST_UPDATE_LOGIN) values (v_formId,my_formfunction,v_solutionendpoint,null,'YES','AXF_DEFAULT',0,sysdate,sysdate,0,0);

select AXF_COMMANDS_SEQ.NEXTVAL into v_eventId from dual;
Insert into AXF_COMMANDS (EVENTID,FORMID,EVENTNAME,DISPLAYMENU,COMMANDNAMESPACE,REQUIRESCONVERSATION,SORTBY,SOLUTIONNAMESPACE,MENUTYPE,SPECIAL,RESPONSIBILITY,CREATED_BY,CREATION_DATE,LAST_UPDATE_DATE,LAST_UPDATED_BY,LAST_UPDATE_LOGIN) values (v_eventId,v_formId,'ZOOM','Managed Attachments','UCM_Managed_Attachments','NO',3,'UCM_Managed_Attachments','ZOOM',null,null,0,sysdate,sysdate,0,0);

Insert into AXF_COMMAND_PARAMETERS (PARAMETERID,EVENTID,PARAMETERNAME,DATASOURCENAME,DATABLOCKNAME,FIELDNAME,CONSTANTVALUE,CREATED_BY,CREATION_DATE,LAST_UPDATE_DATE,LAST_UPDATED_BY,LAST_UPDATE_LOGIN) values (AXF_COMMAND_PARAMETERS_SEQ.NEXTVAL,v_eventId,'application','CONSTANT',null,null,'EBS_instanceA',0,sysdate,sysdate,0,0);

Insert into AXF_COMMAND_PARAMETERS (PARAMETERID,EVENTID,PARAMETERNAME,DATASOURCENAME,DATABLOCKNAME,FIELDNAME,CONSTANTVALUE,CREATED_BY,CREATION_DATE,LAST_UPDATE_DATE,LAST_UPDATED_BY,LAST_UPDATE_LOGIN) values (AXF_COMMAND_PARAMETERS_SEQ.NEXTVAL,v_eventId,'businessObjectType','CONSTANT',null,null,my_entity,0,sysdate,sysdate,0,0);

Insert into AXF_COMMAND_PARAMETERS (PARAMETERID,EVENTID,PARAMETERNAME,DATASOURCENAME,DATABLOCKNAME,FIELDNAME,CONSTANTVALUE,CREATED_BY,CREATION_DATE,LAST_UPDATE_DATE,LAST_UPDATED_BY,LAST_UPDATE_LOGIN) values (AXF_COMMAND_PARAMETERS_SEQ.NEXTVAL,v_eventId,'businessObjectKey1','CONSTANT',null,null,my_datablockname||'.'||my_fieldname,0,sysdate,sysdate,0,0);

Insert into AXF_COMMAND_PARAMETERS (PARAMETERID,EVENTID,PARAMETERNAME,DATASOURCENAME,DATABLOCKNAME,FIELDNAME,CONSTANTVALUE,CREATED_BY,CREATION_DATE,LAST_UPDATE_DATE,LAST_UPDATED_BY,LAST_UPDATE_LOGIN) values (AXF_COMMAND_PARAMETERS_SEQ.NEXTVAL,v_eventId,'businessObjectValue1','DATA',my_datablockname,my_fieldname,null,0,sysdate,sysdate,0,0);

END;
/

The following screenshot displays the entries for transactions in Receivables.