Tuesday, December 9, 2008

Finding the Parent Path

If you have ever needed to extract hierarchical data from an 8i system to put into a data integrator template for import into a TRIRIGA 9 system, you probably know the challenge of pulling the parent path. The following SQL statement can be used to extract this data (this example is for Space Class Current):

select a.spec_id, a.sys_parent1, a.classname,
(select b.fullpath from tridata.t_spaceclasscurrent b where b.spec_id = a.sys_parent1) "PARENTPATH"
from tridata.t_spaceclasscurrent a
order by a.fullpath
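
The same result can also be produced with an outer self-join, which some optimizers handle better than a scalar subquery. Here is a sketch of that variation, written with the old-style (+) operator since 8i predates ANSI join syntax:

-- outer self-join variant: b is the parent row, if one exists
select a.spec_id, a.sys_parent1, a.classname, b.fullpath "PARENTPATH"
from tridata.t_spaceclasscurrent a, tridata.t_spaceclasscurrent b
where b.spec_id (+) = a.sys_parent1
order by a.fullpath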

Thursday, November 6, 2008

Business Connect

If you are using Business Connect for integration development and would like a tool that can log into your application server, show all of the available methods and allow you to manually execute them, you may want to consider SoapSonar by Crosscheck Networks (http://www.crosschecknet.com/download/download.php). They offer both an enterprise edition (with a free trial) and a free personal edition that will provide access to the Business Connect WSDL methods.

To use the product, enter your WSDL location (e.g., http://yourServerName:8001/ws/TririgaWS?wsdl) and click on the Capture WSDL button. You will see a screen that looks like this:

[Screenshot: SoapSonar after capturing the TririgaWS WSDL]

To execute a method, use the tree on the left to drill into it. Once the method is selected, look at the schema fields on the request, enter any required data (depending on the method you are calling), set the authentication to use your API user, and click Execute.

In the example below, I executed getModules, which has no required input. The screen shows the response from the server.

[Screenshot: the getModules response in SoapSonar]

Monday, September 8, 2008

Add a Documents Tab to a Custom Object

Adding a Documents tab to a custom object is a very simple process. Here’s what you do:
• In the Data Modeler, create an association between your object and the Document module, Document object using an association string of Has Document with a reverse string of Is Document For
• Revise and republish the business object
• Open the GUI Builder and add a xxDocuments tab
• Under your new tab create a query section called xxRelatedDocuments using Document - Display - Associated to Current Record as the query, setting the Association Type to Has Document
• Re-label the DeAssociate action to Remove
• Add a new action called Upload, check the Pop Up box, set the Action Type to Custom and enter this URL: /html/en/default/docmgmt/objectupload/objectFrameSet.jsp?folderId=-1&showFolders=true&multiFile=true
• Publish your GUI

Creating a System tab on a Custom Object

The information contained on the out-of-box System tab can be very useful for administrators and power users. Here is how to extend a System tab to a custom object:

Add the following fields to your business object using Find:
  • triLanguageLI
  • triFormLabelSY
  • triBusinessObjectLabelSY
  • triRecordNameSY
  • triProjectNameSY
  • triRecordStateSY
  • triCreatedSY
  • triModifiedByTX
  • triModifiedSY

I also like to include the record's spec_id, which is triRecordIdSY.

For the Modified By field to work, you will have to perform a few additional steps:
  • Create a new association to the triPeople module, triPeople business object with an association string of Auto Recorded By and a reverse association of Auto Recorded.
  • Set the triModifiedByTX field to be a locator field. Locate using triPeople, Auto Recorded By and map to triNameTX
  • If your object has a sub action that calls xxBusinessObjectName - Synchronous - Permanent Save Validation, it will call the xxBusinessObjectName - Synchronous - Module level business rules workflow, which updates the Modified By field. If not, you will need to write workflows to update this field on any given state transition

Publish the Business Object then open the GUI builder. Create a new tab called xxSystem and a new section under that tab called xxRecordInformation. Add the above fields to the layout as shown below and Publish the GUI.

[Screenshot: the xxRecordInformation section layout in the GUI Builder]

If you are using triModifiedByTX, don’t forget to include a locator query on the Modified By field.

Tuesday, August 12, 2008

How to view long-running SQL Processes with Oracle 10g Enterprise Manager

To see the long-running queries:

  • Open the Oracle Enterprise Manager 10g web interface (typically found at http://DBServer:1158/em/console)
  • Click on the Performance tab
  • At the bottom of the page, there is a Top Activity link
  • On that page, there will be a Top SQL list. If you click on a SQL ID, it will bring up the statement that was executed.
  • If it is a select, there will be a Plan tab that shows how expensive the statement is.
  • If it looks like a TRIRIGA query, whether coming from a manager, a portal, or a GUI query section, the query should be refactored and/or redesigned to allow for better performance.
  • The report can normally be identified easily by the tables or module views it selects from. For example, if the statement in question looks like:

SELECT ROW1, ROW2, ROW3
FROM T_TRIBUSINESSOBJECT
WHERE ROW1 like ':1'

  • Look in the Report Manager for a report/query/graph, etc., that selects from the triBusinessObject BO.
  • Test different reports by opening each one and clicking the "Export SQL" link. The SQL can then be verified against what Oracle is reporting back.

For more information on how to trace long SQL from long running queries, please refer to my earlier post: http://www.tririga.info/2008/01/analyzing-long-running-sql-queries.html

Gathering Database Statistics

In order to improve performance, TRIRIGA recommends gathering database statistics on a daily basis. Normally this is done as part of the daily cleanup jobs. If you notice poor system performance on a day with particularly heavy usage, or during a data load, you can manually gather statistics by running the following script:

EXECUTE DBMS_STATS.GATHER_SCHEMA_STATS('TRIDATA',DBMS_STATS.AUTO_SAMPLE_SIZE);

If 2.1.9.X is being used, also run:
EXECUTE DBMS_STATS.GATHER_SCHEMA_STATS('TRIRIGA',DBMS_STATS.AUTO_SAMPLE_SIZE);
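
On Oracle 10g you can also schedule the gather as a recurring job with DBMS_SCHEDULER instead of relying on the cleanup jobs. A minimal sketch; the job name and run time are arbitrary choices:

BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'GATHER_TRIDATA_STATS',  -- arbitrary job name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN DBMS_STATS.GATHER_SCHEMA_STATS(''TRIDATA'', DBMS_STATS.AUTO_SAMPLE_SIZE); END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',   -- nightly at 2:00 AM
    enabled         => TRUE);
END;
/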

Thursday, July 10, 2008

Capturing Traffic Between the App and Database Servers

In order to troubleshoot issues you are having, you may want to capture the network traffic between your application and database servers. This will allow you to see the SQL statements being passed to the database, along with a lot of other potentially useful information. The basic procedure is outlined below (thanks to Eric Glass):


Here are some general instructions for capturing traffic sent from the application server to the database server:

1) Stop the application server.
2) Set up a remote SSH tunnel on any free port on the application server (e.g., 9999), pointing to the database host/port. The exact terminology used for this varies by SSH client, but typically it would be:
Source Port: 9999
Destination Host: database server
Destination Port: database port (typically 1521 for Oracle, 1433 for SQL server)
For PuTTY, the required settings can be found in the connection configuration dialog under SSH -> Tunnels (source port would be "9999", destination would be of the form "host:port", e.g. "mydbserver:1521"). For OpenSSH, the command-line parameter would be "-R 9999:mydbserver:1521".
3) Make the SSH connection. You will need to leave the shell window open for the duration for the tunnel to remain open.
4) Start Wireshark. Go to Capture -> Interfaces; click "Options" for the interface you want to capture. In the Capture Options dialog, enter the following in the Capture Filter:
host mydbserver && port 1521
(substituting the appropriate database server host/port). Ensure "Update list of packets in real time" is checked. Click "Start" to start capturing.
5) Test the tunnel; make another SSH connection to the application server, and type:
telnet localhost 9999
This will open a connection to the remote end of the tunnel; traffic will be routed over the SSH connection, and an outgoing connection will be made from your client box to the database server on the destination port. You should see the outgoing connection traffic in Wireshark. Close the second SSH window and restart capturing in Wireshark (Capture -> Restart from the menu). Alternatively, you can revise the capture filter at this point to also include web traffic from the browser to the application server:
(host mydbserver && port 1521) || (host appserver && port 80)
6) Modify the application server JDBC configuration to point to port 9999 on 127.0.0.1. The application server will be connecting to the tunnel source port locally, which will spawn an outgoing connection from your client box to the database server (which will be captured).
7) Restart the application server and verify that traffic is received in Wireshark.
8) Restart capturing in Wireshark, perform the desired operation in the application, then stop capturing in Wireshark. This will ensure you get a concise capture with minimal "noise" from other application activity.
9) Save the capture file, and reset your configuration to normal.

Note that technically, you don't have to establish the SSH tunnel directly to the application server; any SSH server will work (i.e. a Unix box completely unrelated to the application system). You would need to ensure the source end of the tunnel on the SSH server accepts connections from other hosts (typically an option in the tunnel configuration settings); you would then point the application server to that box instead of 127.0.0.1.

Once you have the capture, there are a variety of tools Wireshark provides to analyze it. For example, select Statistics -> Conversations from the Wireshark menu and click the "TCP" tab; this will show all TCP connections active during the capture period. You can start examining these by selecting an entry from the list, right-clicking, and selecting "Apply as Filter -> Selected -> A<->B" from the popup menu. This will filter the packet window to display only that connection. You can also view or save all the traffic for a connection as text by selecting a packet in the list and choosing "Follow TCP Stream". For many protocols, Wireshark includes dissectors that will parse out the traffic and present it semantically (for example, it can decode HTTP traffic and display request/response information). I don't believe there is currently a dissector for Oracle, but you can view the raw TCP streams to see the SQL statements going over the wire.

Monday, July 7, 2008

Identifying Linked Excel Cells

If you need to modify an Excel spreadsheet that has cells linked to TRIRIGA, it is important to identify which cells are linked first. Inserting rows into the spreadsheet will not change the original linked locations, so you will otherwise end up with significant remapping.

Excel linked cells are stored in the SOBJTYPE_EXCEL_PARAMS table. The spec_template_id column is actually the GUI_ID that the Excel template is embedded in, not the business object's spec_template_id. To find the GUI_ID, look up the GUI name in the GUI_FORMS table. This will give you the spec_template_id to use when finding all the Excel links.
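
For example, a lookup along these lines returns the ID (a sketch; I am assuming GUI_FORMS exposes gui_id and gui_name columns, so verify the column names in your environment):

select gui_id, gui_name
from GUI_FORMS
where gui_name = 'triRETransactionLeaseTerms';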

Use this script and update the spec_template_id to match the GUI you need to pull the cell links from. In this example it is the ‘triRETransactionLeaseTerms’ GUI.

Select EXCEL_ROW, EXCEL_COL, PARAM_ID, PARAM_STR, PARAM_DISP_STR, ROW_NUM, TABLE_VIEW_FLAG from SOBJTYPE_EXCEL_PARAMS where spec_template_id = 10003842;

Thursday, June 26, 2008

Capturing HTTP Packets

While troubleshooting TRIRIGA, you may find yourself needing to analyze the traffic between a workstation and the server or the traffic between your app/process server and the database server. Here are the basic instructions for using the open source application Wireshark to capture this network traffic.

Download and install Wireshark from here:

http://prdownloads.sourceforge.net/wireshark/wireshark-setup-1.0.0.exe

After you get it installed, start it up; under the menu go to "Capture -> Interfaces". It should pop up a dialog showing all the network interfaces on the box; generate some traffic (open a web page etc.) and the "Packets" column should start counting up. That's the interface you want to sniff. From that dialog, click the "Options" button next to the interface that you identified. In the "Capture Filter" field of the options dialog, enter the following:

host ServerName && (port 80 || port 443 || port 8080 || port 8443 || port 8001)

(assuming ServerName is the server you want to monitor). Then click Start and go perform whatever operation you need to trace. Select "Capture -> Stop" from the menu when it's done, then save the result (pcap format should be the default, which is correct).

Wednesday, June 18, 2008

Multi-Line Field Labels

In the GUI, a long field label pushes the other fields in the section out to the same length, which can make the layout look bad. For example, in the image below the "Please enter the Estimated Retirement Date" field label is too long.

[Screenshot: the long label stretching the section]

To solve this issue, a break sequence (e.g., <br>) can be added to split the label across multiple lines, as shown below.

[Screenshot: the label split across two lines]

Here are the steps:
  1. In tririgaweb.properties, remove "<", ">" and "&" from the EXCLUDE_CHARACTERS line. It should look like this: EXCLUDE_CHARACTERS={
  2. Save tririgaweb.properties. Make the same change on your process server, if applicable
  3. Restart JBoss, WebLogic or WebSphere on all app and process servers
  4. In the GUI Builder, revise the GUI, select the field and edit the label as shown below, inserting a break sequence where you want to split the label
  5. Apply the changes and publish the GUI

[Screenshot: the break sequence inserted into the label in the GUI Builder]

Tuesday, May 6, 2008

Change the Browser Title

If you want to change the Browser Title (the words that display at the top of your web browser), simply add the following line to tririgaweb.properties:

browserTitle=Workplace Central

Thursday, May 1, 2008

Setting up Flexible Menus

In TRIRIGA 2.5 and higher, flexible menus can be added to the portal. These are essentially shortcuts to portal sections. The image below shows what flexible menus look like.

Setting these up is not very difficult, and to make it just a little easier a colleague of mine put together a document describing exactly how to do it. Click here to download the PDF document.
Thanks Srinath!

Troubleshooting System Performance

A big complaint at many organizations running TRIRIGA is that performance is not where they want it to be. Performance tuning can be very challenging, and poor performance is often not the result of any one problem; typically there are several contributing issues. Here is a list of things to look at if you are having performance issues:

  • Make sure your hardware is appropriately sized. TRIRIGA will typically help you with this. You just need to provide them the details of how you use (or anticipate using) the system, such as the number of users, the average number of new objects you add per day, the number of objects in the system, etc.
  • Verify that your database and application/process servers have sufficient memory allocated to them, and make sure you have enough physical memory on your servers to support the values you use.
  • Optimize portal performance by removing as many queries as possible from portals and eliminating, or at least cutting back on, the use of auto-refresh.
  • Upgrade to a current platform release – TRIRIGA is constantly improving the platform. Check the release notes for all of the more current platforms to determine if any new enhancements can help you.
  • Try to determine if the performance issues are related to the database or to the process/app servers. Check memory and CPU usage to help determine this.
  • Have a DBA capture long running SQL queries by running Statspack, and try to determine where they originate in the system. Many come from queries within the system that can be tuned to perform better. I have a post on my blog about this: http://www.tririga.info/2008/01/analyzing-long-running-sql-queries.html.
  • Disable any unnecessary workflows. For example, if you don't use out of box approvals or notifications you can bypass a lot of system overhead by disabling these workflows, even on a per-module basis. (Be careful here if you are not familiar with workflows)
  • Check the database for disproportionate table sizes. I had a customer with 250 million rows in IBS_SPEC_ASSIGNMENTS and only 10 million rows in IBS_SPEC
  • Get a count of all objects in the database to be sure what you have is what you expect; you may find yourself with 250k discussion threads that you never use. Here is an article in my blog on how to do this: http://www.tririga.info/2008/01/get-count-of-all-objects-in-trirga-9.html
  • Try to isolate specific actions that perform poorly, like creating a particular type of record. Analyze what the system does internally when these actions occur. Try to determine if a specific workflow or query is causing the problem and work to fix it.

If you have anything to add to this list, please email me and let me know.

Thursday, April 24, 2008

Find Corresponding Object and Module Names

Do you know the name of a BO you are looking for, but can't figure out which module it's in? Here is a script to run under the tririga schema:

/* Get BO and corresponding Module Names - run in tririga schema */
SELECT b.NAME AS bo_name, m.module_name
FROM module_view_map m, ibs_spec_type b
WHERE m.module_id = b.spec_class_type
ORDER BY bo_name

Tuesday, April 22, 2008

Utility for Viewing Configuration and Log Files

Notepad++ is a fantastic, open source utility to view and edit TRIRIGA configuration and log files (it has many other uses as well). If you are tired of looking at a jumbled mess in notepad, try Notepad++.

Monday, April 21, 2008

Locator Fields and Object Metadata Cache

If you enter data into a text locator field and the value you selected disappears as soon as you tab out, you may want to flush the Object Metadata Cache before restarting JBoss or WebLogic. I have seen this several times with the TRIRIGA 9 application, where I select a valid value from a locator query and as soon as I click OK on the query the field wipes itself out. A restart would solve the problem, but a cache flush is faster and does not disrupt the end users.

Wednesday, April 9, 2008

Turn an Error Message into a SQL Statement

Consider the following error message from the server.log:

2008-03-20 15:39:07,946 ERROR [com.tririga.design.smartobject.dataaccess.GenerateSqls](ExecuteThread: '13' for queue: 'weblogic.kernel.Default') Statement : update T_TRISERVICEREQUEST set sys_projectid = 1,triDescriptionTX = ?,triStatusCL = ?,triStatusCLObjId = ?,triControlNumberCN = ?,Sys_Parent1 = ?,triPreviousStatusCL = ?,triPreviousStatusCLObjId = ?,triRequestClassCL = ?,triRequestClassCLObjId = ?,triModifiedSY = ?,coContactNumberTX = ?,triRecordNameSY = ?,triRecordStateSY = ?,triModifiedByTX = ?,triModifiedByTXObjId = ?,triRequestFullNameTX = ? where spec_id = ?

Values :
test4,
Review In Progress,
1601066,
REQ 1000516,
0,
Draft,
1600256,
CRE Moves, ten or less associates (MOVES03),
4737839,
03/20/2008 03:39:07 PM,
xxx,
REQ 1000516-0,
triReview,
Zombron, Robert - eyk343,
3835835,
REQ 1000516-CRE Moves, ten or less associates (MOVES03),
6564922,

To troubleshoot why this error message was thrown, turn it into a SQL statement like the one below, run it in SQL*Plus and use your SQL troubleshooting skills to solve the problem. In this example, the triRequestFullNameTX field was set to 50 characters but the data we were trying to push had 55 characters. The solution was simply expanding the precision of the offending field to a number larger than 55.

update T_TRISERVICEREQUEST
set sys_projectid = 1,
triDescriptionTX = 'test4',
triStatusCL = 'Review In Progress',
triStatusCLObjId = 1601066,
triControlNumberCN = 'REQ 1000516',
Sys_Parent1 = 0,
triPreviousStatusCL = 'Draft',
triPreviousStatusCLObjId = 1600256,
triRequestClassCL = 'CRE Moves, ten or less associates (MOVES03)',
triRequestClassCLObjId = 4737839,
triModifiedSY = '03/20/2008 03:39:07 PM', -- NOTE may need to be formatted correctly for this column
coContactNumberTX = 'xxx',
triRecordNameSY = 'REQ 1000516-0',
triRecordStateSY = 'triReview',
triModifiedByTX = 'Zombron, Robert - eyk343',
triModifiedByTXObjId = 3835835,
triRequestFullNameTX = 'REQ 1000516-CRE Moves, ten or less associates (MOVES03)'
where spec_id = 6564922
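
To confirm the declared size of the offending column before expanding it, you can check the Oracle data dictionary. A minimal sketch, assuming the table is owned by the TRIDATA schema:

select column_name, data_type, data_length
from all_tab_columns
where owner = 'TRIDATA'
and table_name = 'T_TRISERVICEREQUEST'
and column_name = 'TRIREQUESTFULLNAMETX';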

Thursday, April 3, 2008

Site Style Manager Tip

If you ever find yourself in a position where you want to use the Site Style Manager to update the color of a particular screen object and you can't figure out which component it is, all you have to do is view the page source in your browser. Search for the label (if it's text), and right after the title look for the class value. That is your answer.

Wednesday, March 26, 2008

General Troubleshooting

Okay...something is acting weird. A field does not display correctly, the app stops working correctly, a workflow says it's doing one thing but it looks like it's doing something else. What do you do to troubleshoot the problem? Here are some suggestions:

  1. Look for a pattern. Can you recreate what just happened?
  2. Look in the server.log (and output.log for Weblogic). Do you see any strange error messages? (I will post a future blog entry for troubleshooting log messages)
  3. If the problem appears to be with workflows, turn on workflow logging and look at every workflow that fires. Many workflows call other workflows, so this could take a few iterations.
  4. Are the issues related to performance? If you have access to a DBA, ask them to help identify the long-running SQL queries and run an explain plan. Look at my posting about analyzing long running SQL queries.
  5. If the issues are related to specific records in the database, use TOAD (or a similar tool) to compare good records with bad records in both the T_BONAME table and IBS_SPEC table. Are there any inconsistencies between good and bad records?
  6. Are objects stuck in a "bad" state? Use TOAD to check the sys_objectstate in the T_BONAME table against the object_state in the IBS_SPEC table - be sure these match. These fields can be updated to put the object into a valid state.
  7. Are fields not getting filled out by workflow as expected? Check the field length of the destination fields to be sure you're not trying to push 150 characters into a 100 character field.
At some point in the future I will expand on this topic.....

Monday, March 17, 2008

Object Name Exists Error Message

If you see an error message in the application that reads "Record can not be saved as another Record with the same name exists within the Business Object" and you can't find another record with the same name in the application, you may want to run the following SQL query to see if you have any records in the database where the spec_name field is null in the IBS_SPEC table. The query below will show a count of these records with the biggest offenders at the top of the list.

select count(*), type_name from IBS_SPEC
where spec_name is null
group by type_name
order by count(*) desc
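
Once you know which business objects are affected, a follow-up query along these lines will list the individual offending records so you can inspect or repair them:

select spec_id, type_name, updated_date
from IBS_SPEC
where spec_name is null
order by type_name, updated_date desc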

Fixing Objects Stuck in the Wrong State

This post was removed at TRIRIGA's request.  Please email me if you need additional information.  

Friday, March 14, 2008

Search For Orphan Associations

The script below will run for a very long time, but should ultimately come back with all the BOs that have "orphans" in IBS_SPEC_ASSIGNMENTS and the total distinct orphan count for each.

SELECT ist.NAME "BO Name", ist.spec_template_id "BO ID", COUNT(1) "Orphan Count"
FROM (SELECT DISTINCT spec_id, spec_template_id
        FROM (SELECT isa1.spec_id, isa1.spec_template_id
                FROM ibs_spec_assignments isa1
               WHERE NOT EXISTS (SELECT 1
                                   FROM ibs_spec ibs1
                                  WHERE ibs1.spec_id = isa1.spec_id)
              UNION ALL
              SELECT isa2.ass_spec_id, isa2.ass_spec_template_id
                FROM ibs_spec_assignments isa2
               WHERE NOT EXISTS (SELECT 1
                                   FROM ibs_spec ibs2
                                  WHERE ibs2.spec_id = isa2.ass_spec_id))) orphans
INNER JOIN ibs_spec_type ist ON ist.spec_template_id = orphans.spec_template_id
GROUP BY ist.NAME, ist.spec_template_id
ORDER BY 3 DESC;

Momentum Migration Fails on Validation

If the "heartbeat" from the upgrade conversion stops during
the validation phase (server.log file)
indicates "Validating row 18000/400000" or similar) then this
is the script to run as user TRIRIGA on the database:

UPDATE ENVIRONMENT_PROPERTIES SET VALUE = 'false' WHERE
ENVIRONMENT = 'SYSTEM' AND PROPERTY = 'LOCK';
COMMIT;

Then, if necessary, kill the Java processes on the app server (killall -KILL java) and restart the JBoss/WebLogic instance; it will skip the validation step and go on.

Tuesday, March 4, 2008

Monitoring System Workflows

Workflows generated by the system, such as ones triggered by scheduled events, can be difficult to monitor in the Administrator Console since they do not have a user name associated with them. If you run the following SQL against the tririga schema, these events will show as owned by "Crazy User" and can be viewed in any of the Workflow Event Queue Statistics options.

INSERT INTO user_credentials
(user_id, company_id, user_account, admin_console_permissions)
VALUES
(0, 208133, 'Crazy User', 0);
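
To verify the row was added, or to back it out later, something along these lines will do it:

-- confirm the row exists
select user_id, user_account from user_credentials where user_account = 'Crazy User';

-- remove it if you no longer want system events attributed this way
delete from user_credentials where user_id = 0 and user_account = 'Crazy User';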

Tuesday, February 26, 2008

Monitoring Workflows

The following scripts can be used to show you how many workflows are being created every hour and how many workflows are being processed every hour. This can be useful in monitoring system performance, especially when data loading or other large operations are occurring.

--Workflows Created per Hour (in the queue to be processed)
select count(*), to_char(created_date, 'mm/dd/yyyy HH24') from wf_event
group by to_char(created_date, 'mm/dd/yyyy HH24');

--Workflows Processed per Hour
select count(*), to_char(completed_date, 'mm/dd/yyyy HH24') from wf_event_history
group by to_char(completed_date, 'mm/dd/yyyy HH24');
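
To see created versus processed counts side by side, the two queries can be combined with a full outer join on the hour. A sketch (note the hour column sorts as text):

select coalesce(c.hr, p.hr) as hr, c.created, p.processed
from (select to_char(created_date, 'mm/dd/yyyy HH24') hr, count(*) created
      from wf_event
      group by to_char(created_date, 'mm/dd/yyyy HH24')) c
full outer join
     (select to_char(completed_date, 'mm/dd/yyyy HH24') hr, count(*) processed
      from wf_event_history
      group by to_char(completed_date, 'mm/dd/yyyy HH24')) p
on c.hr = p.hr
order by 1;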

Tuesday, February 5, 2008

Rebuilding Oracle Database Indexes

This post was removed at TRIRIGA's request.  You can email me for additional information.  

Simulated SMTP Server

Have you ever wanted to test email notifications without actually sending emails to a real SMTP server? The attached Java program allows you to do just that. Assuming you have Java installed on your computer, it takes about 30 seconds to set your PC up as a simulated SMTP server.

Setup
Setting it up is easy. Create a folder on your PC, for example c:\smtp. Copy the smtp.jar file to your new folder, open a command prompt, navigate to the new folder, and run the following: C:\SMTP>java -jar smtp.jar. Leave the command prompt window open.

The final step is setting your TRIRIGA application to use your PC as its SMTP server. Edit the tririgaweb.properties file (or use the System Manager in the Admin Console) and set mail.smtp.host to your local IP address. All that's left is to restart JBoss/WebLogic so the application uses the new settings.

Usage
Once everything is set up as described above, usage is pretty straightforward. Use the application to trigger a notification (close a work task, etc.) and watch your command window. You will see log entries each time the system receives a message. It should look similar to this:

C:\SMTP>java -jar smtp.jar
Starting session with /10.119.216.152
Closing session with /10.119.216.152

Now open Windows Explorer and navigate to the C:\SMTP directory. In addition to the smtp.jar file, you should have two new files: xxxxxxxxx-x.log and xxxxxxxxx.mht. The log file shows the raw text of the email message and the mht file shows the formatted text. You can open the mht file in a browser to look at the content, or change the .mht extension to .eml and open the file in Outlook Express.

Wednesday, January 30, 2008

Troubleshooting Workflow Performance

Have you ever had something that was taking a long time to run and wanted to figure out where it was spending its time? These SQL statements may help you figure it out. You need to change the value of the date string to the point in time you want to start tracking.

-- ----------------------------------------------------------------------
-- Lists all workflows instances that have fired since a given date and
-- time, in the order they fired along with their status
-- ----------------------------------------------------------------------
select wf_name, bo_type_name, bo_event_name,
case
when status_id = 1 then 'In Progress'
when status_id = 40 then 'Running'
when status_id = 90 then 'Stopped'
when status_id = 100 then 'Completed'
else 'Unknown'
end as status
,start_date, end_date
from wf_template
where template_flag not in (10,20,25,30) -- 10,20,25,30 are templates, everything else is an instance
--and status_id <> 100 -- 100 are completed, so uncomment this to see only those that haven't finished running
and created_date > to_date('01/29/2008 01:23:45 PM', 'MM/DD/YYYY HH:MI:SS AM')
order by created_date asc

-- ----------------------------------------------------------------------
-- Lists workflows instances that have fired since a given time, along
-- with how long it took for them to process
-- ----------------------------------------------------------------------
select wf_name, bo_type_name, bo_event_name, trunc(86400*(end_date-start_date)) as processing_time_secs -- total elapsed seconds

from wf_template
where template_flag not in (10,20,25,30) -- 10,20,25,30 are templates
and created_date > to_date('01/29/2008 01:23:45 PM', 'MM/DD/YYYY HH:MI:SS AM')
order by 4 desc

-- ----------------------------------------------------------------------
-- Gives the average processing time and the number of times a given workflow
-- (by name) has fired since a specified date and time
-- ----------------------------------------------------------------------
select wf_name, count(*), avg(trunc(86400*(end_date-start_date))) as processing_time_secs

from wf_template
where template_flag not in (10,20,25,30) -- 10,20,25,30 are templates
and created_date > to_date('01/29/2008 01:23:45 PM', 'MM/DD/YYYY HH:MI:SS AM')
group by wf_name
order by 3 desc

Thursday, January 24, 2008

Bypassing the TRIRIGA 9 Approval Process

Do you use the built-in approval process in TRIRIGA 9? If you do not use it, and you don't follow the steps outlined below, you are allowing your system to generate unnecessary business objects, create unnecessary associations and run unnecessary workflows. All of these unnecessary items result in slower system performance and increased space usage.


Using assets as an example, most of my clients never use the out-of-box approval process for activating an asset. Here's what I do to bypass the approval process:

1. Retire Workflow: triAsset - triActivate - Submit for Approval and Dependant Record Update
2. Modify Workflow: triAsset - triActivate - Make Dependant Records Read Only (add a step at the end of the workflow to have it call triFinalApprovalHidden)

By doing this you are executing the hidden approval step without creating any of the intermediate approval objects. It is also easy to put this back in place if you choose to use the approval process in the future; simply reverse the process. The workflows for most objects have the same names as the ones above (except for the BO name) and the process to disable them is the same.

Additionally, most of my clients don't need or want any notifications on the status change of an object. If you don't need a notification on any status change for a particular object (assets in this example), bypassing this process is easy; simply retire the following workflow: triAsset - Associate - triStatus - Create Notification Details current status. This saves the system from having to run this workflow for every status change of every object. To turn it back on, just republish the workflow.

Tuesday, January 22, 2008

IBS_SPEC_ASSIGNMENTS Table

Is your IBS_SPEC_ASSIGNMENTS table getting too large? One of my clients currently has 19 GB of data in over a quarter billion rows in this table alone. Here is some SQL to help analyze the situation:

This SQL Statement will determine what the top business objects and association types are in the ibs_spec_assignments table:

select spec_class_type, ass_type, count(*) as cnt
from ibs_spec_assignments
group by spec_class_type, ass_type
order by cnt desc

You may be able to narrow down some records to clean up from there. Also, you may want to find out what exists in the ibs_spec_assignments table that is no longer in the ibs_spec table:

select count(*) from ibs_spec_assignments where spec_id not in (select spec_id from ibs_spec)

and

select count(*) from ibs_spec_assignments where ass_spec_id not in (select spec_id from ibs_spec)
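
On a table this large, the NOT IN subqueries can be slow. A NOT EXISTS version of the same two checks may perform better (same results, assuming spec_id is never null in ibs_spec):

select count(*) from ibs_spec_assignments isa
where not exists (select 1 from ibs_spec s where s.spec_id = isa.spec_id)

select count(*) from ibs_spec_assignments isa
where not exists (select 1 from ibs_spec s where s.spec_id = isa.ass_spec_id)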


I removed the cleanup script that I used at the request of TRIRIGA. You can send me an email if you need more information.


Friday, January 18, 2008

Workflow Cleanup Script

The development group at TRIRIGA indicated that they have seen performance issues with workflow cleanup once you get a lot of instance data in the tables, and they recommend using SQL to clean it. Below is a script TRIRIGA development wrote to do the cleanup (works on the Momentum 2.5 and above platform).
Bring down the process server to kill the cleanup process, then run this script. It will prompt for a date; you should use today or yesterday to clean as much as possible.

-- Workflow Instance information is in the following tables:
-- (many also have template information - so they can't simply be truncated)
--
-- TASK
-- TASK_ASSIGNEES
-- TASK_FILTER
-- TASK_RESULT_LIST
-- WF_CALLWF_RESULT_LIST
-- WF_CONDITION
-- WF_TEMPLATE_STEP
-- OBJECT_TYPE_MAP
-- WF_STEP_INST
-- WF_INSTANCE
-- WF_TEMPLATE
--


-- Select WF Instances that ended at a point older than 'date' and are not active or waiting for a user:
-- SELECT WFT.*
--   FROM WF_TEMPLATE WFT
--  WHERE WFT.TEMPLATE_FLAG = 0
--    AND WFT.STATUS_ID NOT IN (40,60,61)
--    AND WFT.END_DATE < [date]
--
-- [The DELETE statements that followed were mangled when this post was published;
-- they removed the matching instance rows from each of the tables listed above,
-- keyed on WFT.WF_TEMPLATE_ID with TEMPLATE_FLAG = 0.]


Suspending Workflow Instance Logging

Saving workflow instances is helpful for troubleshooting, but can be a drain on system performance. This is especially true when migrating a large amount of data into your system. The scripts below can be used prior to a data migration to turn workflow instance logging off for all workflows (saving your current settings) and to restore workflow instance logging afterward.

--------------------------------------------------------------------------------------------
-- ****************** BEFORE MIGRATION *********************------
-- Create backup of instance_data_flag values ------
-- Update all templates to set instance flag = false ------
--------------------------------------------------------------------------------------------

create table wf_template_bk as (
select wf_template_id, wf_template_version, instance_data_flag from wf_template where template_flag = 1);

create table wf_lookup_bk as (
select wf_template_id, wf_template_version, instance_data_flag from wf_lookup);

update wf_template t1 set t1.instance_data_flag = 0 where t1.template_flag = 1;
update wf_lookup t1 set t1.instance_data_flag = 0;
------------------------------------------------------------------------------------------
-- ****************** AFTER MIGRATION **********************------
-- Update from backup of instance_data_flag values ------
-- Drop backup tables ------
------------------------------------------------------------------------------------------

update wf_template t1 set t1.instance_data_flag = (select t2.instance_data_flag from wf_template_bk t2 where t1.wf_template_id = t2.wf_template_id and t1.wf_template_version = t2.wf_template_version)
where t1.template_flag = 1;

update wf_lookup t1 set t1.instance_data_flag = (select t2.instance_data_flag from wf_lookup_bk t2 where t1.wf_template_id = t2.wf_template_id and t1.wf_template_version = t2.wf_template_version);
drop table wf_template_bk;
drop table wf_lookup_bk;

Monitoring the Cleanup Agent

If you want to track the progress of the cleanup agent you can use log4j to create a cleanup.log. Open the log4j.xml file in the config directory and add the following under the existing appender:

<!-- NOTE: the original XML was stripped when this post was published. The appender
     below is a minimal log4j 1.x reconstruction; the file path and the category
     name are assumptions you should adjust for your environment. -->
<appender name="cleanup" class="org.apache.log4j.RollingFileAppender">
    <param name="File" value="/opt/tririga/log/cleanup.log"/>
    <param name="MaxFileSize" value="5000KB"/>
    <param name="MaxBackupIndex" value="5"/>
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%d %-5p %m%n"/>
    </layout>
</appender>

<category name="com.tririga" additivity="false">
    <priority value="DEBUG"/>
    <appender-ref ref="cleanup"/>
</category>
Be sure to set the File param value to point to the directory you want the log file to be created in. You should probably use the same path as the existing appender in the log4j file. You will probably need to restart JBoss/WebLogic for the change to take effect. The result will be a cleanup.log file that will show you messages similar to the ones below:

DEBUG DBAnalyze - Clean up - Setting Cleanup Batch size to 900
2008-01-17 20:04:41,963 DEBUG Deleted 900 runtime process only WF Templates and associatated data
2008-01-17 22:13:48,040 DEBUG Deleted 900 runtime process only WF Templates and associatated data

Wednesday, January 16, 2008

Analyzing Long Running SQL Queries

If you are able to identify long running SQL queries in your database and want to trace them back to the report or query in the application that caused the SQL to be generated, use the following method:

Consider the following SQL statement:

select t1.spec_id ,t1.SYS_TYPE1 , t1.triUserMessageFlagTX , t1.triIdTX ,
t1.triNameTX , t1.triIdTX , t1.triNameTX , t1.triRequestedByTX , t1.triCreatedSY ,
t1.triStatusCLOBJID from
M_TRIREQUEST t1 where t1.SYS_OBJECTID>0 and t1.SYS_TYPE1 = 1.0002583E7
and t1.SYS_PROJECTID = 1.0
and isnull(t1.triStatusCL, -9223372036854775808)
not in ('Retired','Upload Error','Template', 'History', 'Deleted')
and t1.spec_id in ( select ta1.ass_spec_id from ibs_spec_assignments ta1
where ta1.spec_id in (208133)
union select ta1.spec_id from
ibs_spec_assignments ta1 where
ta1.ass_spec_id in (208133) )
order by upper(t1.triIdTX) ,upper(t1.triNameTX) ,upper(t1.triIdTX) ,upper(t1.triNameTX)

Take a fairly unique column, triRequestedByTX for example, and find all the places it is being used in conjunction with the table/view name M_TRIREQUEST by running the following SQL query:

select * from REP_TEMPLATE_COLUMNS where field_name = 'triRequestedByTX'
and table_name = 'M_TRIREQUEST'

Then, select the rep_template_id and find the names of the reports from the rep_template_hdr table:

select rep_name from rep_template_hdr where rep_template_Id in (17879,25749)

This could be done in one statement like:

select rep_name, rep_template_id from REP_TEMPLATE_HDR where rep_template_id in
(
select distinct rep_template_id from REP_TEMPLATE_COLUMNS
where field_name = 'triRequestedByTX'
and table_name = 'M_TRIREQUEST'
)

Now, open TRIRIGA, go into report manager, click on “open” for any report. Press CTRL-N to open a new window, where you can edit the URL. Replace the report template id in the URL with the one provided above and the offending report will open.

Once you find the offending report you may want to reconsider how you have your filters set up, or you may want to remove it from a portal section or frequently used object. An alternative to removing it is to just remove it from the first tab of an object so it does not run as often.

Wednesday, January 9, 2008

WF_STEP_INST Table

The WF_STEP_INST table in the tririga schema basically holds the structure information for the actual workflow instances, i.e. all of the tasks and how they relate for all of the running workflows. This data is cleaned at runtime if you are in production mode. You shouldn't ever have too much data in this table, as it should only contain the data for:

1) active workflows
2) workflows with pending tasks (e.g. user tasks)

Unless you have tons of workflows with user tasks that are leaving the workflows active (and hence the instance data in the system), or have a lot of orphaned data, this table shouldn't have too many rows. Unchecking Save Instance Data in workflow start tasks reduces the number of rows inserted into (and eventually deleted from) this table. This can be important when doing large data loads, as I have seen delete operations against this table become a bottleneck when the system is processing large quantities of workflows after a data load.
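
A quick sanity check is simply to count the rows; a number far beyond your active and user-task workload suggests orphaned instance data (run in the tririga schema):

select count(*) from wf_step_inst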

Monday, January 7, 2008

Tracking Data Integrator Loads

This SQL script (run against the tridata schema) will provide all Data Integrator loads, the status of the load, and which user initiated the upload:

select ibs.created_date "STARTED_DATE",
ibs.updated_date "FINISHED_DATE",
ibs.object_state "STATUS",
mp.sys_useraccount "USER_ACCOUNT",
ibs.spec_name "UPLOAD_NAME"
from ibs_spec ibs, t_myprofile mp
where ibs.spec_template_id = 106548 and
ibs.created_by = mp.spec_id
order by 1 desc;
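
Note that the spec_template_id of 106548 is environment-specific. Rather than hard-coding it, you can look it up by name first; a sketch, assuming the Data Integrator upload object's name contains "Upload":

select spec_template_id, name
from ibs_spec_type
where upper(name) like '%UPLOAD%';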

Friday, January 4, 2008

Scripts to Identify Object Changes

The scripts below show changes made to various types of objects within the last day. These can be modified to show a date range (see the example after the last script) or a release date.


-- Business Objects (run in tridata schema)
select Updated_Date, s.name
from ibs_spec_type s
where updated_date > SYSDATE -1
order by s.name;

--Workflows (run in tririga schema)
select Updated_Date, data_value Module, wf_name
from wf_lookup w, list_value m
where updated_date > SYSDATE -1
and m.LIST_ID = 2
and m.list_value_id=w.module_id
order by m.data_value, w.wf_name;

--GUIs (run in tririga schema)
select Updated_Date, data_value Module, g.gui_name
from gui_header_publ g, list_value m
where updated_date > SYSDATE -1
and m.LIST_ID = 2
and m.list_value_id=g.spec_class_type
order by m.data_value, g.gui_name;

--Queries (run in tririga schema)
select Updated_Date, data_value Module, r.rep_name
from rep_template_hdr r, list_value m
where updated_date > SYSDATE -1
and m.LIST_ID = 2
and m.list_value_id=r.class_type_id
order by m.data_value, r.rep_name;

--Managers which have been modified
select Updated_Date, name Manager_Name from ibs_manager_description
where updated_date > SYSDATE -1
order by name;

--Portals (run against tririga schema)
select distinct Updated_Date, portal_name from user_portal
where updated_date > SYSDATE -1
order by portal_name;

--Portal Sections (run against tririga schema)
select updated_date, portal_section_name from portal_section
where updated_date > SYSDATE -1
order by portal_section_name;
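
As an example of the date-range variation mentioned above, here is the business object query restricted to changes made within a specific window (the dates are placeholders):

-- Business Objects changed in a date range (run in tridata schema)
select updated_date, s.name
from ibs_spec_type s
where updated_date between to_date('01/01/2008', 'MM/DD/YYYY')
                       and to_date('01/31/2008', 'MM/DD/YYYY')
order by s.name;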

Displaying UOM Values on Form Reports

If you are creating a Form report that includes a Number field and want to display the UOM value of the Number field on the form, use the method shown below. When a Number field with a UOM is created, the system automatically creates a new field, suffixing _UOM to the Number field name. This UOM field is not listed in the GUI, so when you export the Report Header there is no way to include the _UOM field. You have to use this method:

Thursday, January 3, 2008

System Usage Script

If you would like to know who has logged into your TRIRIGA system within the past 30 days, and when, here is a script you can run against the tririga schema. You can update this script to filter for specific users or to filter specific users out (such as data integration users); see the sketch after the script.

SELECT DISTINCT
user_credentials.user_account AS USERNAME,
session_history.start_time AS LOGIN
FROM
user_credentials, session_history
WHERE
user_credentials.user_id = session_history.user_id AND
session_history.start_time > (sysdate - 30)
UNION ALL
SELECT DISTINCT
user_credentials.user_account AS USERNAME,
active_session_user.start_time AS LOGIN
FROM
active_session_user, user_credentials
WHERE
user_credentials.user_id = active_session_user.user_id
ORDER BY 1, 2;
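
As an example of filtering out integration accounts, the history branch of the union can exclude them by name (the account names here are hypothetical placeholders):

SELECT DISTINCT
user_credentials.user_account AS USERNAME,
session_history.start_time AS LOGIN
FROM
user_credentials, session_history
WHERE
user_credentials.user_id = session_history.user_id AND
session_history.start_time > (sysdate - 30) AND
user_credentials.user_account NOT IN ('DI_LOAD_USER', 'BC_API_USER')
ORDER BY 1, 2;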

External Redirect

If you would like to hyperlink from an external web page to a specific form within TRIRIGA, use the format shown below. For example, you may want to have a link from your corporate intranet directly into TRIRIGA 9 that pops up a blank request form. This works very nicely in conjunction with single sign-on.

To get the URL of the form you would like to use, simply go to the form in the application (e.g., open a new request), push the F11 key on your keyboard and copy the URL to your clipboard. It will look like the first example below. Then all you need to do is insert http://your_servername/html/en/default/admin/extRedirect.jsp?url= before the /WebProcess.srv?... portion of the URL.

Instead of this:

http://your_servername/WebProcess.srv?objectId=750000&actionId=750011&propertyId=208133&projectId=1&specClassType=13&specTypeId=10001791&managerType=list&guiId=10001791

Link to this:

http://your_servername/html/en/default/admin/extRedirect.jsp?url=/WebProcess.srv?objectId=750000&actionId=750011&propertyId=208133&projectId=1&specClassType=13&specTypeId=10001791&managerType=list&guiId=10001791

Wednesday, January 2, 2008

Hosting HTML Files Locally

When developing in TRIRIGA, you occasionally come across the need to point to an external HTML file from within the application. For example, you may want to create a detailed help screen in HTML and link to it from a section action within the application. Instead of posting that HTML page to another web server, you can host it directly on your TRIRIGA application server.

All you have to do is copy your html file to the /userfiles/company-1 directory and set your URL to look like this:
/getCompanyFile.jsp?fileLoc=//Company-1/myfile.htm

Another Method for Data Cleanup

This post was removed at the request of TRIRIGA....sorry

Data Cleanup Tool

Here is the link to the newest data cleanup tool for TRIRIGA:

http://ServerName:8001/html/en/default/datadictionary/dataCleanupUtility.jsp

Be very careful using this tool as it can cause some serious damage to your system. Here are a few comments from internal TRIRIGA sources:

“I would DEFINITELY make an export (backup) before running these scripts. They aren’t supported by Platform, so use AS IS (and I wouldn’t give them to a customer without permission).”

“Ok, so the cat's out of the bag so to speak, but please please please do not consider these clean scripts ready to be used without a thorough understanding of the scripts and the results they will have on your system. I encourage you to please contact a PA/TA to walk through these before you employ them. To xxxx's comment, obviously do not share with clients without explicit agreement from TS leadership and the PM on the project. These will delete pretty much all instance data in your system...please tread lightly.”

"Rumor has it that QA plans to support and document this utility….eventually. But, no word yet on when that may happen. So, unfortunately, everyone is on their own. You might try calling tech support and asking about it, but you’ll need to get past the first level tech support people….who will probably refuse to even talk to you about it."

Get a Count of all Objects in TRIRIGA 9

The script below can be used to get a list of all objects in your TRIRIGA database, showing the objects with the greatest record counts first. This can be useful to identify which records are consuming all of the space in your database, especially ones that have no business value for you.
This script should be run against your tridata schema.


SELECT
spec_template_id,
type_name,
COUNT(1) COUNT
FROM
ibs_spec
GROUP BY spec_template_id, type_name
ORDER BY 3 DESC