VantageCloud Lake Demos Public Repository
This repository stores all of the public VantageCloud Lake demos in a single project where the community can collaborate.
Available Demos List
Environment Setup Automation: Python Notebook
Three files are required:
- vars.json (environment variables file)
- 0_Demo_Environment_Setup_Automation.ipynb
- 1_Load_Base_Demo_Data.ipynb
Alternative: use Apache Airflow
- Upload Demo_Setup_Airflow_Python.py to Airflow
- Edit vars.json and upload as "Variables".
- Execute the DAG
- Run 1_Load_Base_Demo_Data.ipynb
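Once vars.json has been imported (via the Admin > Variables screen, or the standard `airflow variables import vars.json` CLI), the DAG can read its settings as Airflow Variables. A minimal sketch; the key names here are illustrative, not necessarily what the import creates:

```python
# Minimal sketch: reading imported vars.json settings inside an Airflow DAG.
# The key name "demo_vars" is illustrative; use whatever key(s) the import created.
from airflow.models import Variable

# deserialize_json=True returns a JSON-valued Variable as a Python dict.
demo_vars = Variable.get("demo_vars", deserialize_json=True, default_var={})
host = demo_vars.get("host")  # "host" is a hypothetical key
```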
To configure the environment used for these demos, perform the steps in "Environment Setup Automation", either by running the Jupyter notebook or the Airflow DAG. Before running these scripts, do the following:
- Edit vars.json to reflect the target environment
- Validate the other environment and hierarchy settings in vars.json (a quick validation sketch follows this list)
- Clusters are set up to be active during nominal US business hours. Adjust as necessary in the notebook or DAG
- If using Airflow, upload the new vars.json to Variables in the Airflow Admin screen
- When the setup is complete, use the Admin notebook to check cluster status and suspend/resume as needed
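A quick sanity check of vars.json (the edit/validate steps above) can catch missing settings before either path runs. A minimal sketch; the required key names are hypothetical, so substitute the keys actually used by the repo's vars.json:

```python
# Sanity-check vars.json before running the setup notebook or DAG.
import json

REQUIRED_KEYS = ["host", "username", "password"]  # hypothetical; use the repo's real keys

with open("vars.json") as f:
    settings = json.load(f)

missing = [key for key in REQUIRED_KEYS if key not in settings]
if missing:
    raise SystemExit(f"vars.json is missing keys: {missing}")
```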
The setup automation:
- Takes environmental declarations (users, databases, etc.) from the JSON file
- Uses US business hours for the clusters' active time; adjust if needed
- Issues GRANTs on retail_sample_data and the DEMO_AUTH_NOS authorization to all demo objects
- Creates a Repositories.PubAuth authorization object for accessing open object stores (see the sketch after this list)
- Creates two databases, "demo" and "demo_ofs", with default block (BFS) and object (OFS) storage respectively
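For reference, the open-object-store authorization follows the common Teradata pattern of empty credentials for public buckets. A hedged sketch via the teradatasql driver, with placeholder host and password; treat it as illustrative rather than the repo's exact DDL:

```python
# Illustrative sketch of the PubAuth authorization the setup creates.
import teradatasql

with teradatasql.connect(host="<host>", user="SYSDBA", password="<password>") as con:
    with con.cursor() as cur:
        # Empty USER/PASSWORD is the common convention for public (open) buckets.
        cur.execute(
            "CREATE AUTHORIZATION Repositories.PubAuth AS DEFINER TRUSTED "
            "USER '' PASSWORD ''"
        )
```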
Per the design, SYSDBA is the account DBA, CGADMIN is the Compute Group Administrator, and demo users belong to the Business Users profile.
Load Base Demo Data: Python Notebook
Loads the minimal data required on the local Lake system to run the base demo notebooks (a sketch of the load pattern follows this list):
- Logs in as SYSDBA
- Loads two dimension tables to BFS storage from S3
- demo.Customer_BFS
- demo.Accounts_Mapping_BFS
- Loads one fact table to OFS Storage from S3
- demo_OFS.Txn_History
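The load pattern is Teradata's simplified NOS read (`SELECT ... FROM (LOCATION = ...)`) wrapped in CREATE TABLE AS. A sketch with a placeholder bucket path; the notebook supplies the real location and credentials from vars.json:

```python
# Illustrative NOS load: create a BFS-resident dimension table from S3 data.
import teradatasql

LOAD_SQL = """
CREATE MULTISET TABLE demo.Customer_BFS AS (
    SELECT * FROM (
        LOCATION = '/s3/<bucket>.s3.amazonaws.com/<path>/'
    ) AS src
) WITH DATA
"""

with teradatasql.connect(host="<host>", user="SYSDBA", password="<password>") as con:
    with con.cursor() as cur:
        cur.execute(LOAD_SQL)
```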
Admin Notebook: Vantage SQL Kernel
- Log in as CGADMIN with its password
- Check compute group status
- RESUME/SUSPEND/DROP compute resources (see the sketch after this list)
- DBC login is available in case DBC access is needed
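The suspend/resume statements follow the pattern documented for VantageCloud Lake; verify the exact syntax against your release. A sketch with placeholder profile and group names:

```python
# Illustrative compute-cluster controls run as CGADMIN; names are placeholders.
import teradatasql

with teradatasql.connect(host="<host>", user="CGADMIN", password="<password>") as con:
    with con.cursor() as cur:
        # Suspend outside business hours...
        cur.execute("SUSPEND COMPUTE FOR COMPUTE PROFILE <profile> IN COMPUTE GROUP <group>")
        # ...and resume when the demos need the clusters again.
        cur.execute("RESUME COMPUTE FOR COMPUTE PROFILE <profile> IN COMPUTE GROUP <group>")
```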
Base Demo Notebook: Vantage SQL Kernel
- Create an OFS table from S3 "CashApp" transactions (DDL sketched after this list)
- Create a foreign table from S3 "Banking History"
- Review Tables - Dimensions in BFS, CashApp in OFS, Banking History in S3
- Execute Joins and Analytics:
- Identify Customers who have experienced Fraud
- Show the victim's full behavioral path through their Banking relationship
- Execute Joins across the Query Fabric (QueryGrid)
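The two table types and a cross-storage join look roughly like the sketch below. The demo.Customer_BFS and demo_OFS.Txn_History names come from the load step above; the CashApp/Banking_History table names, S3 paths, and join key are illustrative:

```python
# Illustrative DDL for the OFS and foreign tables, plus a BFS-to-OFS join.
import teradatasql

OFS_TABLE = """
CREATE MULTISET TABLE demo_OFS.CashApp AS (
    SELECT * FROM (LOCATION = '/s3/<bucket>.s3.amazonaws.com/<cashapp-path>/') AS src
) WITH DATA
"""

FOREIGN_TABLE = """
CREATE FOREIGN TABLE demo.Banking_History
USING (LOCATION('/s3/<bucket>.s3.amazonaws.com/<banking-history-path>/'))
"""

JOIN_SQL = """
SELECT c.*, t.*
FROM demo.Customer_BFS c
JOIN demo_OFS.Txn_History t
  ON c.customer_id = t.customer_id  -- join key is an assumption
"""

with teradatasql.connect(host="<host>", user="<user>", password="<password>") as con:
    with con.cursor() as cur:
        for stmt in (OFS_TABLE, FOREIGN_TABLE, JOIN_SQL):
            cur.execute(stmt)
```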
Model Scoring: Python Notebook (Python 3.8)
- Credentials and UES URI inherited from vars.json
- Create custom container - install libraries and versions
- Upload model and scoring script
- Execute feature engineering and pass the results to scoring (see the sketch after this list)
- Evaluate Model
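End to end, this flow maps onto teradataml's Open Analytics Framework APIs (create_env, install_lib, install_file, Apply). A hedged sketch assuming teradataml 17.20+; the environment name, library pins, file names, feature table, and return schema are all illustrative, not the repo's exact code:

```python
# Illustrative Open Analytics Framework flow for the scoring notebook.
from teradataml import create_context, set_auth_token, create_env, Apply, DataFrame
from teradatasqlalchemy.types import VARCHAR

create_context(host="<host>", username="<user>", password="<password>")
set_auth_token(ues_url="<UES URI from vars.json>")  # argument names vary by teradataml version

# Build the custom container: base image, pinned libraries, and the artifacts.
env = create_env(env_name="demo_env", base_env="python_3.8", desc="fraud scoring")  # pick base_env from list_base_envs()
env.install_lib(["scikit-learn==1.0.2", "pandas"])
env.install_file(file_path="scoring.py")    # hypothetical scoring script
env.install_file(file_path="model.joblib")  # hypothetical model artifact

# Score the engineered features in-platform and fetch the results.
features = DataFrame("<engineered-features-table>")  # placeholder table name
apply_obj = Apply(
    data=features,
    apply_command="python3 scoring.py",
    env_name="demo_env",
    returns={"customer_id": VARCHAR(20), "fraud_score": VARCHAR(10)},  # illustrative schema
)
predictions = apply_obj.execute_script()
```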
Appendix: Create the Model
- OneHotEncode
- Test/Train Split
- Train Model
- Test Model
- Confusion Matrix
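A minimal scikit-learn sketch of these appendix steps; the input file, column names, and estimator are illustrative, not the repo's exact code:

```python
# Illustrative model build: one-hot encode, split, train, test, confusion matrix.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

df = pd.read_csv("transactions.csv")  # hypothetical training extract
X, y = df.drop(columns=["is_fraud"]), df["is_fraud"]  # "is_fraud" label is assumed

preprocess = ColumnTransformer(
    [("onehot", OneHotEncoder(handle_unknown="ignore"), ["txn_type"])],  # assumed categorical column
    remainder="passthrough",
)
model = Pipeline([("pre", preprocess), ("clf", LogisticRegression(max_iter=1000))])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
model.fit(X_train, y_train)
print(confusion_matrix(y_test, model.predict(X_test)))
```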
VantageCloud Lake Fundamentals
Notebooks illustrating the feature/function basics
See README for more details
Fundamentals/Native-Object-Store/NOS_Fundamentals_SQL.ipynb
Demos in UseCases Folder
Each use case has its own data loading notebook. Typically, the data is loaded from an S3 bucket; the bucket name and any credentials are inherited from the vars.json file.
See README for more details
UseCases/Native-KMeans/KMeans_Clustering_Python.ipynb
UseCases/Native-GLM-Regression/Regression_Python.ipynb
UseCases/Native-Sentiment-Analysis/Sentiment_Analysis_Python.ipynb
UseCases/Churn-Prediction-OAF/Churn-Prediction-OAF.ipynb
UseCases/Scaling/Demo 1 - Generate Workload.ipynb
UseCases/Scaling/Demo 2 - Real-Time Monitoring.ipynb
UseCases/Scaling/Demo 3 - System Monitoring Queries.ipynb
UseCases/Proximity-To-Climate-Risk/Proximity_To_Climate_Risk.ipynb
UseCases/Vector-Embeddings-Segmentation/Segmentation_With_Vector_Embedding.ipynb