Reply from sviszt on Jan 5 at 4:32 AM

Hi Maruthiv,

1) Create a new generic datasource as usual, with a function module behind it.
2) The function module needs to implement a different interface than the one you normally use; see below.
3) Edit the entry for the datasource in table ROOSOURCE:
   a. Field DELTA: AIED
   b. Field EXMETHOD: F1 (instead of F2, which is the default after you create the extractor with transaction SBIW)
   c. Field RESETDELTA: function module for resetting the delta; see the documentation of that field.

Function module interface: When you create your new datasource the usual way, it is based on the simplified interface, as in template RSAX_BIW_GET_DATA_SIMPLE. After you change from F2 to F1, you need to use the interface seen in template RSAX_BIW_GET_DATA.

Within the function module you need to implement your own timestamp handling. This means you will need a kind of status table, in which you always save the "biggest" timestamp you have already extracted. During the next delta load, you read this table and select only data with timestamps strictly larger than (not equal to) this timestamp. Then you sort the data by timestamp and again save the biggest timestamp into the table. I would also suggest logging each request with I_REQUNR and the from and to timestamps in a log table, for traceability.

The above procedure applies if I_UPDMODE is D (Delta). If it is R (Repeat), you need to extract with the timestamps of the last delta, which you can read from your log table.

If you have 9 different tables with timestamps, I would suggest boiling them down to 1 "leading" timestamp and using after-images if possible. This means that with the above logic you decide on a from-to interval for the new delta package. Then you consult each of your 9 tables one by one to find any records within this interval. For each table entry, you determine the corresponding document (or item) number and put these numbers into a list. Then throw away duplicates.
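The status-table bookkeeping for delta and repeat described above is language-neutral; here is a minimal Python sketch of the idea. All table layouts and function names here are my own invention for illustration, not SAP objects; in the real extractor this logic lives in the ABAP function module, keyed by I_REQUNR and I_UPDMODE.

```python
# Sketch of the timestamp-based delta bookkeeping described above.
# "status" plays the role of the status table (biggest extracted timestamp);
# "log" plays the role of the request log table (request number + interval).
# All names are illustrative, not real SAP/ABAP objects.

def get_delta_interval(status, log, i_requnr, i_updmode, now):
    """Return the (from_ts, to_ts) interval to extract for this request."""
    if i_updmode == 'R':                       # Repeat: re-use the last delta's interval
        last = log[-1]
        interval = (last['from_ts'], last['to_ts'])
    else:                                      # 'D' Delta: strictly after the saved timestamp
        interval = (status['max_ts'], now)
    # log the request for traceability, as suggested above
    log.append({'requnr': i_requnr, 'from_ts': interval[0], 'to_ts': interval[1]})
    return interval

def extract_delta(rows, status, log, i_requnr, i_updmode, now):
    """Select rows with from_ts < ts <= to_ts, sorted; update the status table."""
    from_ts, to_ts = get_delta_interval(status, log, i_requnr, i_updmode, now)
    delta = sorted((r for r in rows if from_ts < r['ts'] <= to_ts),
                   key=lambda r: r['ts'])
    if delta:                                  # remember the biggest timestamp extracted
        status['max_ts'] = max(status['max_ts'], delta[-1]['ts'])
    return delta
```

Note the strict lower bound (`from_ts < ts`), matching the "larger than, not equal" rule, and that a repeat run reads the interval back from the log rather than from the status table.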
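The interval-and-deduplication step over the 9 tables can be sketched the same way (again plain Python with made-up names; in ABAP these would be SELECTs on your real timestamp-bearing tables):

```python
# Sketch of consolidating several timestamp-bearing tables into one
# de-duplicated list of changed document numbers for a delta interval.
# Row layout ('ts', 'docnr') is illustrative only.

def changed_documents(tables, from_ts, to_ts):
    """tables: list of row-lists; each row carries a change timestamp 'ts'
    and a document number 'docnr'. Returns the distinct document numbers
    touched in the interval (from_ts, to_ts]."""
    docs = set()
    for table in tables:                       # consult each table one by one
        for row in table:
            if from_ts < row['ts'] <= to_ts:   # record falls in the delta interval
                docs.add(row['docnr'])         # duplicates collapse in the set
    return sorted(docs)
```

You would then read the newest state of each of these documents and return it as an after-image.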
Finally, get the extraction-relevant data for each document and return the result. This way you should always get the newest status of every document for which any of the 9 tables has changed. Of course, you first need to load this data into an ODS and cannot load it directly into a cube, because you do not send before-images this way. I hope this helps as a generic guideline. Cheers, Oliver
---------------Original Message---------------
From: maruthiv
Sent: Sunday, January 04, 2015 10:27 PM
Subject: Generic Extraction - FM - Delta Logic

Thank you, Oliver, for the quick response. In my case I have 9 tables with different creation dates and different changed dates, and the datasource is built on a function module. I am now stuck finding the delta logic for this function module; any suggestion would be much appreciated. I am waiting for your response.