Wednesday, May 22, 2013

Security Group Permissions Report

If you need to get a handle on your security groups, I have a Crystal Report that displays your security details in a viewer-friendly table format. The data is retrieved via a SQL command, so the where clause at the end of the SQL can be modified to limit the data returned as needed. You can download the report here.
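As an illustration of the kind of edit involved, the sketch below shows the general pattern; the table and column names here are purely hypothetical placeholders, not the report's actual SQL - the idea is simply to tighten the final where clause:

```sql
-- Hypothetical table and column names, for illustration only;
-- edit the where clause at the end of the report's actual SQL command.
select GROUP_NAME, PERMISSION_NAME
from SECURITY_GROUP_DETAILS
where GROUP_NAME = 'Admin Group'   -- limit the output to a single group
```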

Wednesday, March 14, 2012

Undocumented Steps for Using BIRT

In order to use BIRT for report development in TRIRIGA, there are a few undocumented steps you will need to take to make it work. Start off by installing BIRT and the IBM TRIRIGA plug-in, using the Reporting User Guide as a reference. You will then need to set the IBM TRIRIGA BIRT Preferences.

In the BIRT application menu, go to Window/Preferences and find the IBM TRIRIGA BIRT Preferences. Set the value for your TRIRIGA App Server to "http://servername:8001/remote", and include your username and password. Once this is done, click the "Test Connection" button and verify that you get a "Connected to TRIRIGA" message.

Another important step is to check the value set for FRONT_END_SERVER in your tririgaweb.properties file. This will typically be "appserver:8001", but may vary in your environment. If this value is not set correctly, the report will not display correctly in the application; you will probably see something that looks like the screenshot below.
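For reference, the relevant entry in tririgaweb.properties looks something like the line below (the host and port are examples; use the values for your own environment):

```
FRONT_END_SERVER=appserver:8001
```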

Finally, once you have your first report built and packaged as a zip file (as documented in the Reporting User Guide), you will need to modify the archive before uploading it to the Document Manager. TRIRIGA expects the files to be in the root of the zip file, so the archive will not work as exported. The easiest way to fix this is to copy the files out of the zip file, delete any directories in the zip file, and copy the files back into the root of the archive. If you skip this step, the report will display properly when you preview it in BIRT, but when you run it from the application you will get an error message.
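If you would rather script the flattening step than edit the archive by hand, here is a small sketch using Python's standard zipfile module (the file names are just examples; TRIRIGA itself does not require Python, this is only one convenient way to rewrite the archive):

```python
import os
import zipfile

def flatten_zip(src_path, dest_path):
    """Rewrite a zip archive so every file sits at the archive root."""
    with zipfile.ZipFile(src_path) as src, \
         zipfile.ZipFile(dest_path, "w", zipfile.ZIP_DEFLATED) as dest:
        for info in src.infolist():
            if info.is_dir():
                continue  # drop directory entries entirely
            # Keep only the base file name, discarding any folder prefix.
            # Note: files with the same name in different folders would collide.
            dest.writestr(os.path.basename(info.filename), src.read(info))

# example usage:
# flatten_zip("myreport.zip", "myreport_flat.zip")
```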

Tuesday, August 16, 2011

Impact of Changing List Values

A couple of weeks ago I wrote about determining the impact of changing classification values, so you can judge the effect of a change before you make a data change you will later regret. In the same way, you can determine the impact of changing list values and find out where else in the application a particular list is being used.

To determine which objects use a specific list, run the following SQL script (substituting the name of the list field you are looking for for the value in ATR_NAME below):

select name from IBS_SPEC_TYPE
where SPEC_TEMPLATE_ID in (select SPEC_TEMPLATE_ID from IBS_SPEC_VALUE_META_DATA
where ATR_TYPE = 'List'
and ATR_NAME = 'triTaxPaidToLI')

The results of this query will show which business objects have a field (with the name specified in ATR_NAME above) that points to that list. Using this, you can go into the Data Modeler for each object, select the field that uses the list and click on 'Where Used'. This will pop up a window showing all GUIs, queries and workflows where the field is used. Pay special attention to workflows that show 'Workflow Condition' in the Action column - this will help identify workflows that may use specific list values in their logic.
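If you also want to see which module each of those business objects belongs to, the subquery above can be combined with the IBS_SPEC_TYPE/IBS_MODULE join used in the module-lookup script elsewhere on this page (triTaxPaidToLI is again just an example list field):

```sql
select O.NAME, M.MODULE_NAME
from IBS_SPEC_TYPE o, IBS_MODULE m
where O.SPEC_CLASS_TYPE = M.MODULE_ID
and O.SPEC_TEMPLATE_ID in
  (select SPEC_TEMPLATE_ID from IBS_SPEC_VALUE_META_DATA
   where ATR_TYPE = 'List'
   and ATR_NAME = 'triTaxPaidToLI')
order by M.MODULE_NAME, O.NAME
```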


Wednesday, July 27, 2011

Impact of Changing Classification Values

You may find your users want you to remove/replace some classification values, but before you make the requested changes you need to determine if that classification is used anywhere else in the product and what the impact of changing it will be.

To determine which objects use a specific classification, run the following SQL script (substituting the name of the root classification you are looking for for the value in classification_root_name below):

select name
from IBS_SPEC_TYPE
where spec_template_id in
(select spec_template_id from ibs_spec_value_meta_data
where atr_type = 'Classification' and
classification_root_name = 'Location Primary Use')

The results of this query will show which objects have a field that points to that classification root value. Using this, you can go into the Data Modeler for each object, select the field that uses the classification and click on 'Where Used'. This will pop up a window showing all GUIs, queries and workflows where the field is used. Pay special attention to workflows that show 'Workflow Condition' in the Action column - this will help identify workflows that may use specific classification values in their logic.

Locating the Module for a Business Object

If you've ever known the name of the business object you were looking for, but could not figure out which module it was located in and didn't feel like searching every module in the GUI builder, the script below will help you out. The IBS_SPEC_TYPE table provides a list of all business objects and the IBS_MODULE table gives a list of all modules, so a simple SQL script can join these two tables and give you your answer. For a list of all objects and modules use this:

select O.NAME, M.MODULE_NAME
from IBS_SPEC_TYPE o, IBS_MODULE m
where O.SPEC_CLASS_TYPE = M.MODULE_ID
order by M.MODULE_NAME, O.NAME

If you just want to get the details for a specific object, you can use the script below, substituting the name of the BO you are looking for for the value in O.NAME.

select O.NAME, M.MODULE_NAME
from IBS_SPEC_TYPE o, IBS_MODULE m
where O.SPEC_CLASS_TYPE = M.MODULE_ID
and O.NAME = 'triREPaymentAdjustment'
order by M.MODULE_NAME, O.NAME

Thursday, May 13, 2010

Crystal RAS on a 64 bit Server

I ran into a situation recently where I experienced some difficulty getting Crystal RAS 2008 to work properly when it was installed on Windows Server 2008 R2 (x64). The first issue appeared when we installed the ODBC drivers but they did not appear in the ODBC Data Source Administrator. It turns out that in 64 bit versions of Windows there are actually two ODBC Data Source Administrators, one for 64 bit drivers and one for 32 bit drivers. However, only the 64 bit Administrator is available in Control Panel/Administrative Tools, and the drivers we were using were 32 bit. To access the 32 bit administrator, we needed to manually find and run the following file: C:\Windows\SysWOW64\odbcad32.exe. After that, we could see the ODBC drivers we installed.

Once we figured out why the ODBC drivers were not showing up in the ODBC Data Source Administrator and worked through that, our next issue appeared. We were able to create System DSNs (using the Oracle driver) and verify that they were set up correctly using the "Test Connection" feature, but for some reason every time we ran a report we would get an error message: Failed to open the connection. Details: [Database Vendor Code: 12154]. This error message translates to ORA-12154: TNS could not resolve the connect identifier specified.

After some extensive research, it turns out that this was a fairly common issue for people using 32 bit ODBC drivers on a 64 bit Windows OS. I confirmed the same error message (ORA-12154) by installing Crystal Designer on the server and running a report directly from Crystal using the same ODBC connection (thereby bypassing the RAS server). As it turns out, the parentheses in the default installation path, C:\Program Files (x86), cause problems with the ODBC drivers. All we had to do to fix the problem was install RAS in a directory that did not contain parentheses - I put it in C:\CrystalReports. Finally, after much aggravation, RAS was working.

Thursday, March 4, 2010

External Mail with Attachments

I recently had a request from a client to add an action to Contact Center that emails the 'Requested By' person a PDF explaining how to use the ESS application. The same basic method can be used to email just about any file to anyone with a People record in TRIRIGA (or even someone who does not have a People record, if you create a custom GUI to enter the email address).

Anyhow, here are the basic steps:
  1. In the Application Setup manager there is a business object called Offline Content. Add a new record in the BO, giving it a Name and ID and attaching the file you want to send to the Offline Content field.
  2. Create a new query that returns only the offline content record you just created - this will be used in the workflow below.
  3. Create a new workflow that first runs the query above to get the content and then retrieves the person you want to email.
  4. Once you have a handle on the content and person (or at least their email address), add a Create task to create a Mail/Email Message and map (or hard code) the Body and Subject fields.
  5. Next, add a Create task to create a Mail/Email Address record that will be your From address (you must have both a To and a From email address). Map or hard code the Address and Name fields.
  6. Add two Associate tasks, both using the string 'Email From Address', to associate the From address to the Email Message.
  7. Repeat steps 5 & 6 for a To address, this time using the string 'Email To Address' for your Associate tasks.
  8. Now, add a Create task to create a Mail/Email Attachment. In this step, map to the Content field from the triOfflineContentBI field on the Offline Content record (from the Query task in step 3).
  9. Add one more Associate task that associates the Email Message to the Email Attachment using the string 'Email Attachment'.
  10. Finally, use a Trigger Action task to trigger SEND on the Email Message.
A decent example of sending an email message with binary content exists out of the box in the current TRIRIGA application. Refer to the triRETransactionPlan - Synchronous - Send Offline Transaction Template workflow in the Project module for reference.