We at Sierra-Cedar have been working on the Campus Solutions/HRMS database “split” with our customers now for nearly two years. We are proud that we have 16 live as we close out 2012. I believe this is more than any other system integrator. Our first customer to go live, Palomar College, has been live for well over a year and is doing well, which I witnessed firsthand when I visited them in December. All of our customers have used Oracle-delivered integration models, with the majority selecting the “subscriber only” model. We have learned many important lessons from our customers while they implemented the split, as well as after they went live and were faced with supporting two databases and technical environments. This blog shares what I think are the top three leading practices for implementing and governing a split environment:
- Clean up your data and custom interfaces into PeopleSoft.
- Test well before and after going live.
- Govern your integration environment well.
Each of these leading practices is discussed below.
Data Clean Up
To appreciate the significance of maintaining valid data, you need to understand how the delivered Oracle integration works. The Oracle split integration is based on several Integration Broker asynchronous messages. The main message is the “Person Basic Sync” message (PBS), which carries person bio/demographic data. Whenever this type of data is changed, a PBS message is published with the new values. The subscribing system processes the message and updates its version of the data accordingly.
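To make the flow concrete, here is a minimal conceptual sketch of the publish-on-change pattern the split relies on. This is not PeopleCode or Integration Broker code, and the identifiers are illustrative; it only shows the shape of the flow: the publishing system changes person bio/demo data, publishes a PBS-style message with the new values, and the subscribing system applies it to its own copy.

```python
# Conceptual sketch only -- NOT PeopleSoft code. Illustrates the
# asynchronous publish/subscribe flow behind the split integration.
import json
from queue import Queue

message_bus = Queue()  # stands in for the Integration Broker queue

publisher_db = {"0001234": {"first_name": "Pat", "last_name": "Lee"}}
subscriber_db = {"0001234": {"first_name": "Pat", "last_name": "Lee"}}

def update_person(emplid, **changes):
    """Change bio/demo data on the publishing side, then publish."""
    publisher_db[emplid].update(changes)
    # Publish the new values, as a Person Basic Sync message would.
    message_bus.put(json.dumps({"emplid": emplid, "data": publisher_db[emplid]}))

def run_subscription():
    """Subscriber drains the queue and applies each message to its copy."""
    while not message_bus.empty():
        msg = json.loads(message_bus.get())
        subscriber_db[msg["emplid"]] = msg["data"]

update_person("0001234", last_name="Lee-Ortiz")
run_subscription()
print(subscriber_db["0001234"]["last_name"])  # Lee-Ortiz
```

The key point the sketch makes is that the subscriber only learns about a change if a message is actually published; any code path that updates the data without publishing leaves the two copies silently out of sync.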
A major component of most split projects is ensuring that PBS messages are published whenever bio demographic data is modified. For the delivered features it is Oracle’s responsibility to ensure the message is published. For custom software, it is the customer’s responsibility. Custom email loads, address updates, and admission application loads are common examples of integration points that insert or update data that is integrated using PBS, and they are often “message enabled” to support the split.
The point I want to make in this blog is that the work does not stop there. It is also very important that new and existing data be valid. This pertains to any data that was originally converted as well as data added through any integration points you maintain. I cannot think of a single institution I have helped with the split that has not had to remedy some missing validation. If the custom programs that insert or update person data load invalid data that will not pass the delivered Oracle validation, the subscription process will fail and the data will fall out of sync. Using a component interface, which validates the data, will reduce or eliminate this problem. Unfortunately many, if not most, institutions do not use component interfaces, so they generally have some invalid data in their system that must be addressed before the split integration points can function properly.
A less obvious source of integration failures is invalid or corrupt existing data in the system. The resulting errors can be very misleading because often the error is not related to the data being updated. For example, an invalid email address associated with a person can cause a phone number change to fail. The most common issues have been:
- Invalid primary email and phone flags – none set or more than one set
- Blank required fields – including EFF_STATUS, which I mention because it produces no error, but the data is not updated
- Invalid prompt – invalid name suffix and prefix are very common
- Orphaned data – this can cause all sorts of odd behavior, but generally will result in an error and out-of-sync data
- Invalid related display – invalid state codes are the most common
These types of problems are so pervasive that I recently developed an automated process to validate customers’ data and have added this step to every Sierra-Cedar split project. Here are some of the results of running this utility against an Oracle demo database (over 3,000 validation errors – yikes!).
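To illustrate what one of these checks looks like, here is a minimal sketch of the first item on the list: find people whose primary email flag is set on no row, or on more than one row. The table and column names (PS_EMAIL_ADDRESSES, PREF_EMAIL_FLAG) follow common PeopleSoft conventions but should be verified against your own system; sqlite stands in for the real database, and this is not the Sierra-Cedar utility itself.

```python
# Illustrative validation check: emplids with zero or multiple primary
# email flags. Table/column names are assumptions modeled on PeopleSoft
# conventions; sqlite stands in for the real database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE PS_EMAIL_ADDRESSES (
        EMPLID TEXT, E_ADDR_TYPE TEXT, EMAIL_ADDR TEXT, PREF_EMAIL_FLAG TEXT
    )""")
conn.executemany(
    "INSERT INTO PS_EMAIL_ADDRESSES VALUES (?,?,?,?)",
    [
        ("0001", "CAMP", "a@x.edu", "Y"),  # OK: exactly one primary
        ("0002", "CAMP", "b@x.edu", "Y"),
        ("0002", "HOME", "b@y.com", "Y"),  # invalid: two primaries
        ("0003", "CAMP", "c@x.edu", "N"),  # invalid: no primary
    ],
)

# Flag every emplid whose primary count is anything other than one.
rows = conn.execute("""
    SELECT EMPLID,
           SUM(CASE WHEN PREF_EMAIL_FLAG = 'Y' THEN 1 ELSE 0 END) AS PRIMARIES
      FROM PS_EMAIL_ADDRESSES
     GROUP BY EMPLID
    HAVING SUM(CASE WHEN PREF_EMAIL_FLAG = 'Y' THEN 1 ELSE 0 END) <> 1
     ORDER BY EMPLID
""").fetchall()
print(rows)  # [('0002', 2), ('0003', 0)]
```

The same GROUP BY/HAVING pattern extends naturally to the other checks on the list – blank required fields, invalid prompt values, and orphaned child rows.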
Testing

Testing is always important, but it is especially important while performing the split and maintaining a split environment. This is because of the many moving parts and the large number of configuration settings related to splitting the database. The moving parts are every program – Oracle-delivered or custom – that updates synchronized data and must publish messages. To work properly, each of these touch points must send properly formatted messages whenever data is changed. In addition, as discussed above, the data must be valid in order to pass through the message handler on the subscribing system. Whenever a message fails – because it is never sent, or because it contains invalid or incorrectly formatted information – the data will become unsynchronized.
The other aspect of the split that makes testing critical is the large number of configuration settings that are part of the split. It is critical to correctly move all configuration settings from your development environment to your production environment during go-live. I have witnessed too many customers become puzzled during their go-live because their integration was not working properly in production after working perfectly for months in their development and test environments. This problem is always the result of not testing the move to production, which includes migrating these settings and verifying that they are working after the move. The migration process should be as automated as possible, documented, and tested just like any other program during each test cycle. I strongly suggest creating a detailed go-live checklist with every task that will be executed during go-live and testing this list every time you refresh your development and test environments. You should plan to do this a minimum of three times.
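The "verify the settings after the move" step on that checklist lends itself to automation. Here is a minimal sketch of the pattern, under the assumption that you can export your integration-related settings from each environment; the setting names below are hypothetical placeholders, not actual PeopleSoft configuration keys.

```python
# Sketch of automated configuration verification after a move to
# production. Setting names are hypothetical placeholders.

def diff_settings(expected, actual):
    """Return {setting: (expected, actual)} for every mismatch."""
    problems = {}
    for key in expected.keys() | actual.keys():
        if expected.get(key) != actual.get(key):
            problems[key] = (expected.get(key), actual.get(key))
    return problems

# Settings captured from the tested (dev/test) environment...
dev_settings = {
    "PERSON_BASIC_SYNC.active": "Y",
    "node.HCM_TARGET.url": "https://hcm.example.edu/PSIGW",
    "queue.PERSON_DATA.status": "Run",
}
# ...versus what actually landed in production after the migration.
prod_settings = {
    "PERSON_BASIC_SYNC.active": "Y",
    "node.HCM_TARGET.url": "https://hcm-dev.example.edu/PSIGW",  # stale value!
    "queue.PERSON_DATA.status": "Run",
}

mismatches = diff_settings(dev_settings, prod_settings)
print(mismatches)
```

Running a comparison like this during each test cycle, and again during the go-live outage, catches exactly the class of "worked for months in test, broken in production" problems described above.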
In my experience, configuration testing does not end with go-live. I have seen several Oracle patches impact configuration. Watch out for Campus Solutions and HCM bundles and patches, and especially PeopleTools patches and upgrades. I recommend running basic integration tests, at a minimum, after applying Oracle maintenance, in order to ensure that the split integration points have not been impacted.
When testing the split integration points, be very careful when verifying the results of your tests. In many cases, the integration points will corrupt data not related to the information changed in a particular transaction. For example, an email change may result in all of the person’s effective dated history (PERS_DATA_EFFDT) being deleted. Moreover, there are a number of circumstances that result in a message status of “Done” (i.e., no errors) even though the data is not actually updated. After further investigation, you may or may not find an error message buried in the logs. The ONLY way to detect these errors is to verify all of the synchronized tables whenever data is changed. I do this with the automated verification utility that we developed at Sierra-Cedar. I suggest using this or something similar in order to ensure thorough data verification.
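The essence of that verification is comparing the synchronized tables across the two databases rather than trusting the message status. Here is a minimal sketch of the idea, with sqlite standing in for the CS and HCM databases; this is not the Sierra-Cedar utility, and the table is trimmed to a few columns for illustration.

```python
# Sketch of cross-database verification: compare a synchronized table
# between the two systems and report rows present on only one side.
# sqlite stands in for the CS and HCM databases.
import sqlite3

def out_of_sync(cs_conn, hcm_conn, table):
    """Rows present in one database but missing from the other."""
    cs = set(cs_conn.execute(f"SELECT * FROM {table}"))
    hcm = set(hcm_conn.execute(f"SELECT * FROM {table}"))
    return {"only_in_cs": cs - hcm, "only_in_hcm": hcm - cs}

cs = sqlite3.connect(":memory:")
hcm = sqlite3.connect(":memory:")
for conn in (cs, hcm):
    conn.execute(
        "CREATE TABLE PS_PERS_DATA_EFFDT (EMPLID TEXT, EFFDT TEXT, MAR_STATUS TEXT)"
    )

cs.execute("INSERT INTO PS_PERS_DATA_EFFDT VALUES ('0001','2012-01-01','S')")
cs.execute("INSERT INTO PS_PERS_DATA_EFFDT VALUES ('0001','2012-06-01','M')")
# The subscribing side silently lost the history row -- the failure
# mode described above, even though the message status read "Done".
hcm.execute("INSERT INTO PS_PERS_DATA_EFFDT VALUES ('0001','2012-06-01','M')")

diff = out_of_sync(cs, hcm, "PS_PERS_DATA_EFFDT")
print(diff["only_in_cs"])  # {('0001', '2012-01-01', 'S')}
```

A real utility would iterate this over every synchronized table and restrict the comparison to the columns the integration actually carries, but the principle is the same: verify the data itself, not the message status.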
In summary, I recommend that you:
- Test well during your normal development cycle.
- Verify the integrations during the go-live outage.
- Test well each time Oracle maintenance is applied.
In addition, I suggest that your test plan contain the following use cases:
- Create New Person data from CS and HCM. Be sure you cover ALL of your custom code.
- Update Existing Person bio/demo data from CS and HCM. Include cases such as effective dated changes, corrections, and blanking out existing values.
- Create and update synchronized setup data, which is typically a huge list. The most important and heavily utilized include job code, position, and department setup tables.
Governance

Many institutions are surprised by the effort required to maintain a split environment. The first impact comes when they go live and refresh their development and test environments. They must now refresh twice as many databases and should ensure that the databases are in sync, Integration Broker is set up, and the Integration Broker runtime message tables are synchronized or truncated. These are all new steps for most institutions, and they are required to support testing the integration going forward.
The next challenge is that data often becomes out of sync soon after go-live. This can result from custom programs that insert invalid person data directly, from system crashes, and from message errors caused by invalid existing data. The best advice I can give is to run the Verification Engine often to determine what is out of sync, and then correct the condition. Correcting the condition often requires updating the data directly with SQL. If you are a “subscriber only” institution and are not sure which version of the data is correct, a good first step is to update only student data (no job record). This usually clears up 99% of the errors. Once the data is cleaned up, the next step is to find the root cause and resolve it.
Another consideration is ongoing patching and updates to customizations. Remember that anytime you update or extend a custom program that updates person data you must ensure that the required messages are published. This requires testing the integration. In fact, you should plan to test the integration any time you change custom programs that are impacted by the split as well as each time you apply Oracle patches and bundles.
Keep in mind that Oracle maintenance may have co-requisites. If Oracle delivers Integration Broker objects, they may overwrite your version. For this reason, I recommend that, at a minimum, institutions test the split integration as I described in the previous section.
If you do some basic testing you should catch the majority of problems introduced by applying Oracle maintenance.
The last challenge that comes to mind is general system performance. While the split integration has performed well in general, the extra load of running two databases and two PeopleSoft technical environments does require more computing resources. For most customers this means additional hardware.
On January 17th at 2:00 pm EST, I will be delivering a live webinar via the HEUG titled, Campus Solutions / HCM Instance Separation Part 2: Deep Dive Into Utilities. The session will focus on technical impact analysis and remediation including a deep dive into the CedarCrestone Split Utilities, which have saved colleges and universities countless hours by quickly finding and safely split-enabling their customizations. Registration is free to HEUG members.
I hope this is helpful. Good luck with your split project.