
Client Success Story - Oracle FDMEE is the Cloud Data Hub at Legg Mason


This session describes how Legg Mason uses Oracle Financial Data Quality Management, Enterprise Edition (FDMEE) to automate the movement of metadata and data between on-premises ("ground") systems and EPM Cloud. We take a detailed look at the FDMEE server configuration, which allows automated jobs to pull dimensions from an on-premises Essbase ASO application, custom scripting to invoke EPM Automate jobs that build dimensions and move data between pods, and, finally, the movement of data in and out of the Oracle Profitability and Cost Management Cloud Service (PCMCS) application. A live demo illustrates the main components of this FDMEE-driven EPM process management architecture.
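
As context for how FDMEE "drives" these cloud processes: the custom scripts it launches are largely thin wrappers around the EPM Automate command-line client. A minimal sketch of that wrapper pattern in Python (the install path, pod URL, credentials, and file name are placeholders, not values from the presentation):

```python
import subprocess

EPMAUTOMATE = r"C:\Oracle\EPM Automate\bin\epmautomate.bat"  # assumed install path
CLOUD_URL = "https://pcmcs-pod.example.com"                  # placeholder pod URL

def epm(*args: str) -> None:
    """Run one EPM Automate command and raise if it reports a failure."""
    result = subprocess.run([EPMAUTOMATE, *args], capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError(f"EPM Automate failed on: {' '.join(args)}\n{result.stderr}")

# A typical ground-to-cloud hop: sign in, push a file to the pod's inbox, sign out.
epm("login", "svc_epm_user", "password.epw", CLOUD_URL)  # credentials are placeholders
epm("uploadfile", "pcm_staging_data.txt")                # file name is a placeholder
epm("logout")
```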


  1. Oracle FDMEE is the EPM Cloud Data Hub at Legg Mason. Tuesday, 6/25/19, 10:00 AM – 11:00 AM, #211, Level 2. Seema Shah, Regional Manager; Corey Munch, AVP.
  2. Agenda: Who We Are; Project Overview; Staging Area Overview; Use of FDMEE; Gotchas/Challenges; Next Steps/Improvements; Questions?
  3. Ranzal & Alithya joined forces in November 2018 to offer EPM, ERP, & Analytics solutions: a proven business analytics leader with a history of successful implementations and growth, and 20+ years of implementations. Offerings: Advisory Services, Implementation Services, Technical Services, Hosting & Support, Training Services, Intellectual Property. Speaker: Seema Shah, Regional Manager, 14+ years with Alithya, integration focused for 10+ years, Washington, DC, seema.shah@alithya.com.
  4. Speaker: Corey Munch, AVP, Manager/Technical Expert – Business Data Solutions, 6+ years with Legg Mason, Baltimore, MD, cnmunch@leggmason.com. Guided by a mission of Investing to Improve Lives™, Legg Mason helps investors globally achieve better financial outcomes by expanding choice across investment strategies, vehicles and investor access through independent investment managers with diverse expertise in equity, fixed income, alternative and liquidity investments.
  5. Three Primary Project Objectives: #1 Provide an enterprise-wide solution that consistently measures profitability, in a timely manner, across multiple dimensions (Product, Strategy, Firm, Channel, Vehicle, Location, etc.). #2 Create a “What If” tool that can assess profitability by changing existing attributes. #3 Create a process focused on resource utilization by better understanding costs and related drivers.
  6. Staging Area – Data Sources: Essbase (ASO) – assets and revenue by product, strategy and domicile; Essbase (ASO) – distribution-sourced assets by company and channel; Essbase (ASO) – expenses by project code, for data enrichment; Apptio (flat file) – tech expenses, allocated by department, for attribution; IPD (flat file) – distribution expense, allocated by company, for attribution; HFM – expenses.
  8. Staging Area – Data Sources.
  9. Staging Area – Data Sources (diagram): a batch file calling a MaxL script loads the source data into the staging area Planning app; users input driver data via forms.
  11. Staging Area – Design (diagram): staging area Planning app → calc script via batch file → staging area ASO cube.
  13. Staging Area – Design (diagram): staging area ASO cube → FDMEE → PCMCS.
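
The "calc script via batch file" hop in the staging design (slides 11–13) is the classic Essbase pattern of a batch wrapper running a MaxL script. A minimal Python sketch of that wrapper, assuming the MaxL shell (essmsh) is available on the server; the script name, credentials, and host are hypothetical:

```python
import subprocess
from pathlib import Path

ESSMSH = "essmsh"                       # MaxL shell on the Essbase/FDMEE server (assumed on PATH)
MAXL_SCRIPT = Path("push_to_aso.mxl")   # hypothetical MaxL script name

def run_maxl(script: Path, *args: str) -> None:
    """Run a MaxL script through essmsh, passing positional arguments, and fail loudly."""
    result = subprocess.run([ESSMSH, str(script), *args], capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        raise RuntimeError(f"MaxL script {script} failed:\n{result.stderr}")

# push_to_aso.mxl (hypothetical) would: log in to Essbase, run the export calc script on the
# staging Planning (BSO) cube, load the exported file into the staging ASO cube, and log out.
run_maxl(MAXL_SCRIPT, "admin", "password", "essbase-host")   # credentials/host are placeholders
```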
  14. Data Flow Diagram: PLN: LMStage → ESB: LMPCMSTG → LMPCMC → R_LMPCMC → ESB: PCMRPT (cloud detail: LMPCMC, R_LMPCMC).
  15. Metadata Refresh (shown on the data flow diagram):
      1) Execute the Outline Extractor to pull out each dimension individually for the chosen app.
      2) Create new reformatted files in the format required for PCMCS dimension updates.
      3) Execute a series of EPM Automate commands to update the metadata and redeploy the cube.
  16. Metadata Refresh. Choose the PCMCS application: Calculation Pod or Reporting Pod. Execution options: Immediate – Online, Immediate – Offline, or Scheduled.
  17. Batch Scheduling.
  18. Metadata Refresh Script:
      1) FDMEE passes the variables selected by the user to a .bat script.
      2) The .bat script calls the Outline Extractor to export a file from the on-prem ASO cube for each dimension.
      3) Perl scripts are used to convert the dimension files.
      4) EPM Automate downloads, deletes and uploads files to/from the profit inbox/outbox.
      5) EPM Automate builds out the PCMCS dimensions using the previously created files.
      6) PowerShell sends an email to alert admins that the process has completed.
      Occurring throughout: logging, file copying/archiving, error handling.
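
A rough outline of what the metadata-refresh wrapper above looks like in script form. This is a sketch, not the production code: the dimension list, working directory, extractor command line, and file names are assumptions, and the PCMCS-specific dimension-build commands are left as placeholders:

```python
import subprocess
from pathlib import Path

DIMENSIONS = ["Account", "Entity", "Product"]     # assumed dimension list
WORK_DIR = Path(r"D:\FDMEE\metadata_refresh")     # assumed working directory
EPMA = "epmautomate.bat"

def sh(cmd):
    """Run one command line and stop the whole refresh on the first failure."""
    subprocess.run(cmd, check=True)

# Step 1 happens in FDMEE itself: the chosen app/pod arrive as arguments to this script.
# Steps 2-3: extract each dimension, then reformat it for PCMCS.
for dim in DIMENSIONS:
    sh(["run_outline_extractor.bat", f"extract_{dim}.properties"])   # placeholder extractor call
    sh(["perl", "reformat_dim.pl",
        str(WORK_DIR / f"{dim}_raw.txt"), str(WORK_DIR / f"{dim}.csv")])

# Steps 4-5: clear old files from the pod, upload the new ones, then build the dimensions.
sh([EPMA, "login", "svc_user", "password.epw", "https://pcmcs-pod.example.com"])
for dim in DIMENSIONS:
    sh([EPMA, "deletefile", f"{dim}.csv"])        # real code would tolerate "file not found"
    sh([EPMA, "uploadfile", str(WORK_DIR / f"{dim}.csv")])
# The PCMCS-specific dimension-load/redeploy commands are omitted here on purpose;
# they come from the EPM Automate documentation for the service.
sh([EPMA, "logout"])

# Step 6: notify the admins (the production process uses PowerShell for this).
sh(["powershell", "-File", "send_completion_mail.ps1"])
```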
  19. On-Premise to Calculation Pod (shown on the data flow diagram):
      1) A series of data load rules that break out the data into appropriate groupings run in parallel.
      2) Execute the rulesets after the data loads.
  20. On-Premise to Calculation Pod. Update the default period. Execution options: Immediate or Scheduled.
  21. On-Premise to Calculation Pod – Data Movement:
      1) FDMEE passes the variables selected by the user to the data load rules and Python scripts.
      2) If selected, a PreBatch Python script clears out data in the calc pod prior to the data load via EPM Automate.
      3) Individual data load rules are run from FDMEE using the source/target settings.
      4) If selected, a PostBatch Python script runs the aggregation rule in PCMCS via EPM Automate.
      Occurring throughout: logging, file copying/archiving, error handling.
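
A minimal sketch of what the PreBatch/PostBatch hooks in steps 2 and 4 might look like as a Python wrapper around EPM Automate. The pod URL, credentials, and flag handling are assumptions, and the PCMCS clear/aggregation commands are intentionally left as placeholders rather than guessed:

```python
import subprocess
import sys

EPMA = "epmautomate.bat"
CALC_POD = "https://pcmcs-calc.example.com"   # placeholder calculation pod URL

def epm(*args: str) -> None:
    """Run one EPM Automate command; a non-zero exit aborts the FDMEE batch."""
    subprocess.run([EPMA, *args], check=True)

def pre_batch(clear_first: bool) -> None:
    """Step 2: optionally clear the target slice in the calc pod before the data loads."""
    if clear_first:
        epm("login", "svc_user", "password.epw", CALC_POD)
        # Placeholder: the PCMCS clear command and its POV arguments belong here.
        epm("logout")

def post_batch(aggregate: bool) -> None:
    """Step 4: optionally run the PCMCS aggregation rule once the load rules finish."""
    if aggregate:
        epm("login", "svc_user", "password.epw", CALC_POD)
        # Placeholder: the command that launches the aggregation rule in PCMCS belongs here.
        epm("logout")

if __name__ == "__main__":
    # Step 1: FDMEE passes the user's selections to this script as arguments.
    stage, flag = sys.argv[1], sys.argv[2].upper() == "Y"
    pre_batch(flag) if stage == "pre" else post_batch(flag)
```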
  22. Calculation to Reporting Pod (shown on the data flow diagram):
      1) Run the model if desired, then export level 0 data to a file for upload to the on-premises Essbase application.
      2) Export the current POV using the stub snapshot methodology for import to the reporting pod.
      3) Load the level 0 data files to the reporting pod and run a rule to optimize data in the reporting pod.
  23. Calculation to Reporting Pod. Select the fiscal year, scenario, and version to copy; confirm whether calculation rules should run prior to the data copy; confirm whether a data extract file should be created for upload to the on-premises application. Execution options: Immediate or Scheduled.
  24. Calculation to Reporting Pod – Data Movement:
      1) FDMEE passes the variables selected by the user to the .bat/.py script.
      2) A Python script collects and outputs the variable file.
      3) The .bat script is called, which runs the EPM Automate commands.
      4) EPM Automate exports data files, downloads/uploads them between pods and loads the data.
      5) An aggregation rule is run in PCMCS to optimize the pod.
      6) PowerShell sends an email to alert admins that the process has completed.
      Occurring throughout: logging, file copying/archiving, error handling, loading of the on-prem ASO cube.
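
Pod-to-pod movement is essentially two EPM Automate sessions back to back: export/download from the calculation pod, then upload/load into the reporting pod. A sketch of that hand-off (the URLs, credentials, file name, and the export/load job invocations are assumptions or placeholders):

```python
import subprocess

EPMA = "epmautomate.bat"
CALC_POD = "https://pcmcs-calc.example.com"   # placeholder URLs
RPT_POD = "https://pcmcs-rpt.example.com"
DATA_FILE = "FY19_allocated_level0.zip"       # placeholder export file name

def epm(*args: str) -> None:
    subprocess.run([EPMA, *args], check=True)

# Session 1 - calculation pod: produce the export, then pull it down to the FDMEE server.
epm("login", "svc_user", "password.epw", CALC_POD)
# (placeholder) run the PCMCS export job that writes DATA_FILE to the pod's outbox
epm("downloadfile", DATA_FILE)
epm("logout")

# Session 2 - reporting pod: push the file up, then load and optimize (steps 4-5 above).
epm("login", "svc_user", "password.epw", RPT_POD)
epm("uploadfile", DATA_FILE)
# (placeholder) run the PCMCS data-load job, then the aggregation rule
epm("logout")

# Step 6 above: notify the admins (PowerShell in the production process).
subprocess.run(["powershell", "-File", "send_completion_mail.ps1"], check=True)
```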
  25. Data Flow Diagram: PLN: LMStage → ESB: LMPCMSTG → LMPCMC → R_LMPCMC → ESB: PCMRPT (cloud detail: LMPCMC, R_LMPCMC).
  26. Cloud to On-Prem Data Movement (diagram): PCMCS → FDMEE → on-prem Essbase (ASO cube).
  27. Calculation to On-Prem ASO Cube – Metadata Update and Data Movement:
      1) MaxL scripts are called from within the Calc-to-Reporting process kicked off in FDMEE.
      2) Level 0 data from the on-prem ASO reporting cube is exported and saved.
      3) The on-prem ASO reporting cube is dropped and recreated using the staging cube.
      4) The level 0 data is loaded back into the cube.
      5) The fully allocated calc pod data file is loaded into the cube.
      6) Aggregate views are created to improve performance.
      Occurring throughout: logging, file copying/archiving, error handling.
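
The rebuild in steps 2-6 maps naturally onto a short MaxL script driven from the batch process. The sketch below shows one way to express it, with the MaxL embedded in a Python driver that runs it through essmsh; the application, database, rules-file, and data-file names are placeholders, and each MaxL statement should be checked against the MaxL reference for the installed release:

```python
import subprocess
import tempfile
from pathlib import Path

# MaxL sketch of the reporting-cube rebuild (names below are placeholders).
MAXL = """
login $1 identified by $2 on $3;
/* step 2: save current level-0 data from the reporting ASO cube */
export database 'PCMRPT'.'Rpt' level0 data to data_file 'lev0.txt';
/* step 3: drop the reporting cube and recreate it from the staging cube */
drop database 'PCMRPT'.'Rpt';
create database 'PCMRPT'.'Rpt' as 'LMPCMSTG'.'Stage';
/* steps 4-5: reload level-0 data, then the fully allocated file from the calc pod */
import database 'PCMRPT'.'Rpt' data from data_file 'lev0.txt'
    using server rules_file 'ldRpt' on error write to 'lev0.err';
import database 'PCMRPT'.'Rpt' data from data_file 'allocated.txt'
    using server rules_file 'ldAlloc' on error write to 'alloc.err';
/* step 6: build aggregate views to improve query performance */
execute aggregate process on database 'PCMRPT'.'Rpt';
logout;
"""

def rebuild_reporting_cube(user: str, password: str, host: str) -> None:
    """Write the MaxL to a temp file and run it through essmsh, as the batch file would."""
    with tempfile.NamedTemporaryFile("w", suffix=".mxl", delete=False) as handle:
        handle.write(MAXL)
        script = Path(handle.name)
    subprocess.run(["essmsh", str(script), user, password, host], check=True)

rebuild_reporting_cube("admin", "password", "essbase-host")   # placeholders
```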
  28. FDMEE Challenges:
      1) Many moving parts in the process: FDMEE is the wrapper and calls other processes occurring elsewhere, so logs are written to a number of different places and must be reviewed in each of them.
      2) FDMEE logs are separate from PCMCS logs: FDMEE logs must be reviewed to confirm process success and PCMCS logs to confirm build/load success.
      3) FDMEE clogs with too much data: data needs to be broken into smaller slices to ensure successful data loads.
      4) The success of each data load rule must be validated: individual rules may try to run concurrently and one or more of them can fail, after which the failed rules must be reloaded manually.
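
Items 3 and 4 above are usually mitigated by serializing the loads: run one rule per data slice, wait for it to finish, and check its outcome before starting the next. A sketch of that pattern; the rule names, periods, and the command line for launching an FDMEE rule are placeholders (the real call would use the site's FDMEE batch load script or its ODI/REST equivalent):

```python
import subprocess

# One entry per slice: (data load rule name, start period, end period) - names are placeholders.
LOAD_SLICES = [
    ("LM_ASSETS_Q1",  "Jan-19", "Mar-19"),
    ("LM_ASSETS_Q2",  "Apr-19", "Jun-19"),
    ("LM_EXPENSE_Q1", "Jan-19", "Mar-19"),
]

def run_rule(rule: str, start: str, end: str) -> bool:
    """Launch one FDMEE data load rule and report success.
    The command line below is a placeholder, not FDMEE's documented signature."""
    cmd = ["run_fdmee_rule.bat", rule, start, end]
    return subprocess.run(cmd).returncode == 0

failed = []
for rule, start, end in LOAD_SLICES:
    # Sequential execution avoids the concurrent-rule collisions called out above,
    # and smaller period ranges keep any single load from clogging FDMEE.
    if not run_rule(rule, start, end):
        failed.append(rule)

if failed:
    raise SystemExit(f"Reload these rules manually or rerun the batch: {failed}")
```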
  29. FDMEE Challenges (cont'd):
      5) Pulling dynamically calculated data (stats, KPIs, etc.) through FDMEE is very slow: calc scripts had to be created to store the data in the Planning app, then push the stored data to the ASO cube.
      6) Keeping managed infrastructure in sync with the cloud: the servers are managed by a vendor, and there are some challenges ensuring the infrastructure remains in sync with the cloud.
      7) Processing time: the time needed to move data up/down/between pods can be hours, and that time must be baked into any maintenance estimates.
  30. Improvements/Lessons Learned:
      1) Scheduling FDMEE jobs to run later is beneficial: once you know how long jobs normally take, schedule them in sequence or at off-peak/off-hours times; just be sure not to bump into the nightly maintenance window.
      2) Make sure you can access the ODI Console: you may need to kill a job running in FDMEE, which must be done through the ODI Console.
      3) Use the member selector in FDMEE windows: typing values directly into the fields can result in the correct parameters not being set.
      4) Use the “offline” execution mode: it allows you to navigate within Workspace while the batch/script is running.
  31. Improvements/Lessons Learned (cont'd):
      5) Would the processes benefit from using a REST API?
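
For context on item 5: the same steps that EPM Automate performs can in principle be driven through Oracle's EPM Cloud REST APIs, which removes the local CLI dependency and returns job status as JSON that a script can poll. A very rough sketch of the idea only; the endpoint path, payload fields, and job name below are illustrative placeholders and should be replaced with the documented resources from the EPM Cloud REST API guide for the specific service:

```python
import time
import requests

POD_URL = "https://pcmcs-calc.example.com"
AUTH = ("svc_user", "password")          # placeholder credentials; real code would use a vault

# Placeholder endpoint: take the real job resource from Oracle's EPM Cloud REST API guide.
JOBS_ENDPOINT = f"{POD_URL}/epm/rest/v1/jobs"

def launch_job(payload: dict) -> str:
    resp = requests.post(JOBS_ENDPOINT, json=payload, auth=AUTH, timeout=60)
    resp.raise_for_status()
    return resp.json()["jobId"]          # field name is an assumption

def wait_for(job_id: str) -> None:
    while True:
        resp = requests.get(f"{JOBS_ENDPOINT}/{job_id}", auth=AUTH, timeout=60)
        resp.raise_for_status()
        status = resp.json().get("status")
        if status not in ("RUNNING", "QUEUED"):
            print(f"Job {job_id} finished with status {status}")
            return
        time.sleep(30)                   # poll instead of tying up a console session

wait_for(launch_job({"jobType": "AGGREGATION", "jobName": "AggReporting"}))  # placeholder job
```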
  32. Q&A. Seema Shah, Regional Manager; Corey Munch, AVP.
