Igor Alekseev is a Partner Solution Architect at AWS in the Data and Analytics domain. In his role, Igor works with strategic partners, helping them build complex, AWS-optimized architectures. Prior to joining AWS, he implemented many projects in the Big Data domain as a Data/Solution Architect, including several data lakes in the Hadoop ecosystem. As a Data Engineer, he was involved in applying AI/ML to fraud detection and office automation. Igor's projects spanned a variety of industries, including communications, finance, public safety, manufacturing, and healthcare. Earlier in his career, Igor worked as a full-stack engineer and tech lead.
May 26, 2021 03:15 PM PT
Ever wanted the low cost of a data lake combined with the performance of a data warehouse? Welcome to the Lakehouse. In this session, learn how you can build a Lakehouse on AWS using Amazon S3 and Delta Lake, integrate with AWS Glue, and make the content available to AWS services such as Athena and Redshift. Learn how other companies have created an affordable, high-performance Lakehouse to drive all their analytics efforts.
November 17, 2020 04:00 PM PT
How are customers building enterprise data lakes on AWS with Databricks? Learn how Databricks complements the AWS data lake strategy and how Databricks integrates with numerous AWS Data Analytics services such as Amazon Athena and AWS Glue.
Speakers: Denis Dubeau and Igor Alekseev
June 23, 2020 05:00 PM PT
Predicting the movements of price-action instruments such as stocks, ForEx, and commodities has been a demanding problem for quantitative strategists for years. Simply applying machine learning to raw price movements has proven to yield disappointing results. New tools from deep learning can substantially enhance the quality of results when applied to traditional technical indicators and their corresponding entry and exit signals, rather than to prices directly. In this session, Kris Skrinak and Igor Alekseev explore the use of Databricks analysis tools combined with deep learning training accessible through Amazon SageMaker to enhance the predictive capabilities of two technical indicators: MACD and slow stochastics. We use the S&P 500 as a baseline for prediction. We first explore optimizing the statistical parameters of these indicators, followed by hyperparameter optimization of the DeepAR deep learning model. The session will illustrate how to build such indicators in Databricks notebooks and extend Databricks' functionality to train deep learning models in the cloud via PySpark and Amazon SageMaker. No prior experience is required.
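As background for the session: MACD is conventionally computed as the difference between a fast and a slow exponential moving average (EMA) of closing prices, with a "signal" line that is an EMA of the MACD itself. A minimal pandas sketch is below; the 12/26/9 spans are the standard textbook defaults, not parameters taken from the session, and the toy price series is purely illustrative.

```python
import pandas as pd


def macd(close: pd.Series, fast: int = 12, slow: int = 26,
         signal: int = 9) -> pd.DataFrame:
    """Compute the MACD line, signal line, and histogram for a price series.

    MACD line  = EMA(fast) - EMA(slow)
    signal     = EMA(signal span) of the MACD line
    histogram  = MACD line - signal (crossovers give entry/exit signals)
    """
    ema_fast = close.ewm(span=fast, adjust=False).mean()
    ema_slow = close.ewm(span=slow, adjust=False).mean()
    macd_line = ema_fast - ema_slow
    signal_line = macd_line.ewm(span=signal, adjust=False).mean()
    return pd.DataFrame({
        "macd": macd_line,
        "signal": signal_line,
        "histogram": macd_line - signal_line,
    })


# Toy upward-trending price series for illustration only.
prices = pd.Series([100 + 0.5 * i for i in range(60)])
out = macd(prices)
```

In a Databricks notebook, the same calculation is typically expressed over Spark window functions or applied per symbol with a pandas UDF; the spans themselves (12/26/9) would be the statistical parameters the session describes optimizing before the DeepAR hyperparameter search.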
June 23, 2020 05:00 PM PT
How are customers building enterprise data lakes on AWS with Databricks? Learn how Databricks complements the AWS data lake strategy and how HP has succeeded in transforming business with this approach.