Automating Loads

I believe this is a new function within BW 3.0. The DataSource now has an option "DataSource transfers double data records" on the "Processing" tab, along with a flag to "Ignore double data records".

I am not sure of the workaround for earlier versions of BW.
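For illustration only (this is plain Python, not SAP code, and the record fields are invented), the effect of the "Ignore double data records" flag can be sketched like this: records whose key has already been seen in the load are dropped, and only the first occurrence is kept.

```python
def ignore_double_records(records, key_fields):
    """Sketch of 'Ignore double data records': drop any record whose
    key has already occurred in this load, keeping the first one."""
    seen = set()
    unique = []
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key in seen:
            continue  # double data record: ignored
        seen.add(key)
        unique.append(rec)
    return unique

# Hypothetical sample load with a doubled document number.
rows = [
    {"doc": "100", "amount": 50},
    {"doc": "100", "amount": 50},  # double of the first record
    {"doc": "101", "amount": 75},
]
print(ignore_double_records(rows, ["doc"]))
# → [{'doc': '100', 'amount': 50}, {'doc': '101', 'amount': 75}]
```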

Two Topics: Automating your InfoCube deletes, and using an ODS to track deltas.

TOPIC 1
-------
If you do not COMPRESS the current fiscal period's data in your target cube (compression eliminates the Request ID), you can automate deleting the prior request when you reload your data every day. You accomplish this through configuration on the InfoPackage with which you load the data:

A) Go in to InfoPackage maintenance

B) Click the Data Targets tab.

C) For your selected data target, click the icon that looks like the Greek sigma, under the column labeled "Automatic loading of similar/identical requests in the InfoCube." You can also type DELE in the command field. A popup window entitled "Deleting Request from InfoCube after Update" will appear.

D) On the popup window, you have the option of selecting "Always delete existing requests if delete conditions are found" and "Only for the same selection conditions." If you select these options, then BW will automatically delete (or reverse if you have aggregates) any prior request for the same selection criteria which has not been compressed.

You can configure your InfoPackage in this manner, then compress the request once you move to a new fiscal period and know that you will not load any more data for the prior fiscal period.
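The deletion behavior configured above can be sketched conceptually (again in plain Python for illustration, not how BW implements it): before the new load, any uncompressed request whose selection conditions match the new load's selections is deleted, while compressed requests are left untouched.

```python
def delete_similar_requests(requests, new_selection):
    """Sketch of 'Deleting Request from InfoCube after Update':
    drop uncompressed requests with the same selection conditions
    before the new request is loaded."""
    kept = []
    for req in requests:
        if not req["compressed"] and req["selection"] == new_selection:
            continue  # prior request deleted (reversed first if aggregates exist)
        kept.append(req)
    return kept

# Hypothetical request list for a cube keyed by fiscal period.
cube_requests = [
    {"id": 1, "selection": {"fiscper": "2024001"}, "compressed": True},
    {"id": 2, "selection": {"fiscper": "2024002"}, "compressed": False},
]
# Reloading period 2024002: only the compressed 2024001 request survives.
remaining = delete_similar_requests(cube_requests, {"fiscper": "2024002"})
```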

TOPIC 2
-------
It sounds like you may be reloading every day in order to catch changes to the data. You may know this already, but just in case: You can use an ODS object to capture deltas, and then send only the changes to your cube. This eliminates the need to delete the prior request from your cube, and simplifies the use of cube compression and aggregates.

You then would load the current period's full set of data into your ODS every day, and the built-in functionality of the ODS object would detect the differences and send only these on to your target cube. You can read more about this scenario in the white paper at service.sap.com/bw entitled "Data Staging Scenarios."
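The ODS delta mechanism just described can be illustrated with a small Python sketch (the field names are invented, and a real ODS object also records before/after images in its change log, which this ignores): each full load overwrites the active data by key, and only new or changed records are passed on as the delta.

```python
def ods_delta(active_data, full_load, key_fields):
    """Overwrite active ODS data by key and return only the records
    that are new or changed -- the delta sent on to the cube."""
    delta = []
    for rec in full_load:
        key = tuple(rec[f] for f in key_fields)
        if active_data.get(key) != rec:
            active_data[key] = rec
            delta.append(rec)
    return delta

active = {}
day1 = [{"doc": "100", "amount": 50}, {"doc": "101", "amount": 75}]
day2 = [{"doc": "100", "amount": 60}, {"doc": "101", "amount": 75}]

ods_delta(active, day1, ["doc"])            # first load: everything is new
changes = ods_delta(active, day2, ["doc"])  # only doc 100 changed
```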

Finally, using an ODS object as the data staging area eliminates (I think) your issues with the PSA. Instead of having your application read the PSA, have it read from an ODS object instead. Every ODS object has a unique key, so you won't get duplicate records as you can with a PSA. You can also report on ODS data in BEx queries, if you have any need to, which is yet another advantage this method has over the use of PSA.

We use Cube 0COOM_CO1, which has a time characteristic of fiscal period. We manually load the current period into this cube daily, but to ensure that we do not have duplicate data in the cube, we manually delete the previous day's request before loading the period again. There has to be an easier way to do this; any suggestions on how to automate this process?

Also, the same applies to the PSA. We have an application reading the PSA, and to avoid duplicate records there, we delete the PSA load before loading it again the next day. Unfortunately, the only way I can find to delete a specific request from the PSA is to go into the request, mark it with a status of NOT OK, and then delete all requests with errors.

You can automate the data deletion process from the InfoPackage. Under the "Data Targets" tab of the InfoPackage, there is a checkbox to "delete the entire content of the data target". Setting this checkbox ensures that the data is deleted before the new load. Note that, as the name says, this deletes all data in the target, not just the previous request.

About the PSA: there is a feature by which data can be deleted from the PSA.
Here are the steps -
1. Go to the PSA menu under the Admin. Workbench.
2. Navigate to the DataSource for which you want to set the deletion criteria.
3. Highlight the DataSource and right-click. You will see "Delete PSA Data" in the context menu. Select that.
4. On the next screen, you can now maintain the deletion parameters. The deletion of PSA data can be done based on Date or Days. You can also select successful & error requests for deletion.
5. Create a background job after setting the deletion parameters and schedule it.
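As a rough illustration of step 4 (again plain Python, not the actual deletion job; the request fields are made up), the parameters boil down to selecting requests older than a cutoff, optionally restricted to successful and/or error requests:

```python
from datetime import date, timedelta

def select_psa_requests(requests, older_than_days, today,
                        include_successful=True, include_error=True):
    """Sketch of the PSA deletion parameters: pick requests older than
    a number of days, filtered by request status."""
    cutoff = today - timedelta(days=older_than_days)
    picked = []
    for req in requests:
        wanted_status = ((req["status"] == "OK" and include_successful) or
                         (req["status"] == "ERROR" and include_error))
        if req["load_date"] <= cutoff and wanted_status:
            picked.append(req["id"])
    return picked

psa = [
    {"id": "A", "load_date": date(2024, 3, 1), "status": "OK"},
    {"id": "B", "load_date": date(2024, 3, 30), "status": "OK"},
    {"id": "C", "load_date": date(2024, 3, 1), "status": "ERROR"},
]
old = select_psa_requests(psa, 7, today=date(2024, 3, 31))  # ['A', 'C']
```

In the real system, step 5's background job applies exactly this kind of selection on a schedule.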


(c) www.gotothings.com All material on this site is Copyright.
Every effort is made to ensure the content integrity.  Information used on this site is at your own risk.
All product names are trademarks of their respective companies.  The site www.gotothings.com is in no way affiliated with SAP AG.
Any unauthorised copying or mirroring is prohibited.