Bas is a programmer, scientist, and IT manager. At ING, he is responsible for the Fast Data chapter within the Analytics department. His academic background is in Artificial Intelligence and Informatics. His research on reference architectures for big data solutions was published at the IEEE conference ICITST 2013. Bas has a background in software development, design, and architecture, with a broad technical view ranging from C++ to Prolog to Scala, and he is a Spark Certified Developer. He occasionally teaches programming courses and is a regular speaker at conferences and informal meetups.
ETL has been around since the 90s, supporting a whole ecosystem of BI tools and practices. While traditional ETL has proven its value, it's time to move on to modern ways of getting your data from A to B. As BI moved to big data, data warehouses became data lakes, and applications became microservices, ETL is next on our list of obsolete terms. Spark provides an ideal middleware framework for writing code that gets the job done fast, reliably, and readably. In this session I will support this claim with some nice 'old vs new' diagrams, code examples, and use cases. Please join if you want to know more about the NoETL paradigm, or just want to be convinced of the possibilities of Spark in this area!
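To make the contrast concrete, here is a minimal sketch of the "NoETL" style: extract, transform, and load expressed as plain code, the same shape a Spark job takes with RDDs or DataFrames. The data and names are illustrative, not taken from the session; plain Python stands in for Spark so the example stays self-contained.

```python
from collections import defaultdict

# Extract: raw CSV-like records (in a Spark job this would be spark.read.csv(...))
raw = ["alice,120", "bob,80", "alice,30"]

def transform(lines):
    """Parse and aggregate totals per customer, as a map/reduceByKey would in Spark."""
    totals = defaultdict(int)
    for line in lines:
        name, amount = line.split(",")
        totals[name] += int(amount)
    return dict(totals)

# Load: print the result; in Spark this would be a write to a sink (e.g. Parquet)
if __name__ == "__main__":
    for name, total in sorted(transform(raw).items()):
        print(f"{name},{total}")
```

The point of the style is that the whole pipeline is ordinary, testable code rather than tool-specific mappings, and the same transformations scale out unchanged when expressed on Spark's distributed collections.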
The Internet of Things is a broad technology field, with lots of interesting use cases and upcoming technologies to dive into. This session focuses on the 'back-end' of IoT solutions. After all, all this data has to be processed effectively to be truly meaningful. For demonstration purposes, a software solution was built with Spark, Kafka, and Cassandra to demonstrate these data flows. The solution is a marketing engine that enables organizations to target their potential customers based on historical and real-time data, thereby both building a strong user profile and responding to events as they happen. In this session, streaming data from IoT sources (sensors) is pulled into an analytics engine and combined with historical data. We use Spark as the technology of choice, since this framework is well suited for combining streaming data with machine learning techniques. Join this session to get an overview of a (nearly) full-blown analytics application built according to the lambda architecture and the reactive manifesto, and to get inspired to set up your own predictive IoT solution! The outline of this session is:
- a short context sketch of the issues we face in back-end applications for the IoT
- an explanation of several design patterns and architecture principles, such as the lambda architecture and the principles of the reactive manifesto
- a presentation of a case study: personalized marketing with historical and real-time data flows
- a deep dive into the architecture of the case-study solution, coming back to the aforementioned patterns and practices
- a wrap-up with a summary and time for questions
The audience will learn theoretical concepts and see how to apply them in the real world.
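The core of the lambda architecture mentioned above is the merge step: a batch view (historical aggregates, e.g. computed periodically in Spark) combined with a speed layer (recent events, e.g. arriving via Kafka) so queries always see fresh totals. The sketch below illustrates that merge with plain Python and invented data; it is not code from the session's solution.

```python
from collections import Counter

# Batch layer: precomputed visit counts per customer from historical data
batch_view = {"alice": 40, "bob": 12}

# Speed layer: events that arrived since the last batch run
recent_events = [("alice", 2), ("carol", 1)]

def serve(batch, events):
    """Serving layer: merge the batch view with the speed layer's aggregates."""
    speed_view = Counter()
    for customer, visits in events:
        speed_view[customer] += visits
    merged = Counter(batch)
    merged.update(speed_view)  # Counter.update adds counts rather than replacing
    return dict(merged)
```

In the real solution each layer is distributed (Spark batch jobs, a Spark Streaming speed layer, Cassandra as the serving store), but the merge logic keeps this same shape.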