Chief Executive Officer of BBVA Data & Analytics
Abstract: Creating value with big financial data
During this session, Elena gives her expert insight into how to innovate and create value with big financial data, covering topics ranging from the challenges non-digital-native companies face in unleashing the value of their data to data-driven measurement. She will illustrate these ideas with several real case studies, both of financial business applications and of new data products built on financial data.
Elena Alfaro is the Chief Executive Officer of BBVA Data & Analytics, a spin-off of BBVA Bank (Spain) whose mission is to create new products and services based on internal and external data sources and advanced analytics. Elena has achieved international recognition through various publications on Big Data applications to urban management and planning. Before joining BBVA, Elena worked at Ericsson, playing an important role in telecommunications development in EMEA and LATAM, and afterwards broadening her experience as an expert in innovation. Regarding her background: Elena holds a BA in Economics & Business Administration from the Universities of Sunderland (UK) and Universidad Autonoma (Madrid), and a Master’s in Intangibles Analysis and Management. She took part in the 2013 edition of the “40-under-40 Young European Leaders” program.
Managing Director HERE Deutschland GmbH
Abstract: Intelligent use of location data – new opportunities for mobility
Like all industries, the mobility and automotive industries are undergoing a digital disruption. This digital revolution holds the promise of addressing many of the significant challenges posed by rapid increases in global urbanization. The infrastructure in cities and dense urban areas is collapsing, and the global urbanization level is expected to rise to two thirds of the world’s population by 2050. Since 1995, more people have been living in cities than in the countryside. Between now and 2030, it is estimated that an investment of $57 trillion will be needed to build and maintain infrastructure worldwide to meet current demographic and growth trends, while at the same time acknowledging that space for new roads and bridges is limited. However, digitalization and the intelligent use of data for mobility solutions and traffic management will improve the situation dramatically. Automated systems, including automated driving, are safe, sustainable and will enhance social inclusion by ensuring mobility for all, including elderly and disabled users. Location platform services compiled on cloud-based platforms will enable the creation of a live depiction of road environments and innovative services like hazard warnings, on-street parking guidance and road sign detection. Using the power of an intelligent cloud-based location platform, HERE empowers customers to achieve better outcomes for multiple use cases, from helping a city manage its infrastructure, to enabling an enterprise to optimize its assets, to delivering drivers to their destination safely.
Michael Bültmann has been Managing Director of HERE Deutschland GmbH since November 2014. Before Microsoft’s acquisition of Nokia’s devices business in 2014, he was Managing Director of Nokia GmbH for six years.
Prior to that, he held various positions in Germany and France, e.g. in controlling and as senior legal counsel, responsible for Germany and countries in Eastern Europe.
He previously worked as an attorney in Berlin and Paris and as a lecturer in International Business Law at the University of Lüneburg.
Michael Bültmann is active in numerous associations, e.g. as a board member of BITKOM (the Federal Association for Information Technology, Telecommunications and New Media) and of SRiW (Association of Self-Regulation in the Internet).
He is married and has 2 children.
Healthcare Industry Leader, IBM Global Business Services
Abstract: Cognitive Assistance Systems in Health Care – Experiences and Potentials
Story: Dr. Watson, please report!
With IBM Watson, a new computer era is emerging: the cognitive era.
The intelligent system IBM Watson is no longer explicitly programmed; it trains and learns much as we humans do. The cognitive computer system supports the user to a whole new extent: as a smart assistant, enriching one’s expert knowledge and making information accessible faster. Expertise and extensive knowledge are becoming more and more important, while at the same time schedules are becoming tighter. What if computers could understand complex questions, scan the literature for us, and answer objectively in natural language?
The cognitive computer system IBM Watson makes these scenarios happen, because Watson understands natural language. The computer can read reference books and, in its own way, comprehend and summarize their content. Whenever one asks it a question, it “browses” through its library, reads texts and searches for answers. Watson can also draw logical conclusions, construct hypotheses for an answer and assess them. In this way the computer system finds relevant information and processes it, supporting and accelerating experts’ decisions.
Cognitive systems are based on a fundamentally new approach. They learn from interaction with data and users, and can therefore adapt to new conditions and changed tasks without requiring new programming. While “reading”, the system learns how to handle the meaning of words in different contexts, and it improves through training and feedback.
IBM Watson is already in action in the health-care sector around the world. Not only in cancer research but also in many other areas, Watson supports physicians, or patients directly, in their daily clinical routine. Medical knowledge is available predominantly in natural language, e.g. as free text in patient records, in medical journal articles or in medical reports. Until now, these data could not be evaluated by computer systems. The greatest barriers were the language itself, be it English, German or French, on the one hand, and an understanding of medicine – medical concepts, correlations and terminologies – on the other. Watson has been trained on both over the past years and today is, so to speak, a qualified assistant physician.
In her keynote Dr. Deutsch will introduce typical cognitive application scenarios in the medical sector and outline future scenarios.
Dr. Eva Deutsch leads the Watson Healthcare Initiative in Europe within IBM Global Business Services. A graduate of TU and MU Vienna in medical informatics, she has been working in health-care consulting and IT for 20 years. Besides long-term activities in the German-speaking world, she also has experience from many international projects and leadership roles. Since 2011 she has been working on cognitive systems in medicine; in 2014 she took over as leader of Watson Healthcare in Europe for IBM Global Business Services. Among other things, Dr. Deutsch also leads the international Competence Center for Medical Linguistics, headquartered in Austria.
Former Creative Executive, Senior Vice President at Walt Disney Imagineering
Abstract: Designing Experiences Through Story
I will explain how story and myth are key to defining the human experience, and how we can use story to guide the design of anything where human engagement with the product is key. Story can be manifested as subtext and as narrative, and each informs design choices differently. Story as subtext is where the story is developed as a manifesto for design teams, used to coalesce ideas and create unity and focus; it is not necessarily obvious to the end user, but it has a subconscious effect. Story as narrative is where the story is front and center in the experience design and unfolds to the user much like a book or movie. I will provide examples of each from projects I’ve worked on around the world.
Design and user experience are most powerful when they connect with people on an emotional level first. Good story taps into human emotion, which in turn can guide the design choices with the most impact. Some of the best stories are the simple ones, and simplicity is key to user-friendly design. Good stories unfold logically, with an understandable hierarchy of importance. Applying that principle to design and user experience results in a product with appeal and a natural user interface. Using story as either design subtext or narrative helps people to “see themselves” in the design. In other words, universal human experiences, rooted in primal emotions like happiness, wonder and curiosity, create common touch points that feel real and comforting to the user.
With more than three decades of Disney experience, Joe Lanzisero is well equipped for his role as creative executive in charge of projects for Walt Disney Imagineering. Working with teams of artists, writers, architects and engineers, he serves as the eyes and artistic conscience of a project from conception through completion.
Joe was responsible for the creative development of the two newest ships for the Disney Cruise Line, and oversaw the teams that designed these new state-of-the-art ships (Disney Dream and Disney Fantasy), which launched in 2011 and 2012 respectively. Many features, such as the innovative dinner show “Animation Magic” and the inclusion of an onboard water coaster (the AquaDuck), are cruise industry firsts. At Hong Kong Disneyland, Joe oversaw the expansion of the park by more than 20 percent over a three-year period. The additions of three new lands – Toy Story Land, Grizzly Gulch and, most recently, Mystic Point – add more excitement and fun for guests of all ages. Lanzisero began his Disney career in 1979 in Feature Animation (now Walt Disney Animation Studios), working on the animation, special effects, storyboarding and story development of numerous features, shorts and special projects. He came to Imagineering in 1987 as a concept designer and was on the design teams for Disney’s Typhoon Lagoon Water Park at Walt Disney World, Critter Country at Disneyland, and Phantom Manor at Disneyland Paris. In 1991, Lanzisero was promoted to senior concept designer and immediately plunged into the development of Mickey’s Toontown, the wacky cartoon “community” that opened at Disneyland Park in 1993. He also developed the concept for Roger Rabbit’s Car Toon Spin, a wild and funny dark ride that opened in Mickey’s Toontown the following year. Lanzisero also supervised the concept design for the Tokyo Disneyland version of Toontown that opened in 1996.
Managing Director, CEO Know-Center
Abstract: European Network of National Centers of Excellence dedicated to Research in Big Data and Data Science
Story: Europe needs more oil
Prof. Stefanie Lindstaedt (TU Graz, Know-Center) and Prof. Volker Markl (TU Berlin, Berlin Big Data Center) will introduce this initiative running under their coordination and funded by BMWI (Smart Data Forum, Germany) and BMVIT (Austria).
Goals of this Network of Centers of Excellence (CoEs) are:
- Create more transparency and visibility of cutting edge Big Data research in Europe
- Increase exchange and collaboration between national CoEs both in research and education
- Foster matchmaking between industrial and research programs and initiatives
- Share best practices and lessons learned, identify synergies, and learn from each other
Europe is home to more than 25 CoEs that perform cutting edge research and drive the development and evolution of all aspects of Big Data. Each of these CoEs approaches Big Data from a different scientific background and with a different focus. Each of them alone can only cover a small part of the whole Big Data picture, but together their competences can define the data-driven Future of Europe. In addition, many of them also act as innovation engines to the local economy. However, European companies are not sufficiently aware of this asset.
This networking event is intended as a kick-off for these cutting-edge think tanks. The main focus of the Center of Excellence (CoE) Network will be on research topics and how they can be transferred into relevant industries. In the future, the exchange of academic expertise across the CoE Network will support further research, enable the roll-out of promising projects across Europe, and strengthen the European economy.
Univ.-Prof. Dr. Stefanie Lindstaedt leads the Knowledge Technologies Institute within the Department of Computer Science & Biomedical Engineering at Graz University of Technology. She is also Managing Director of Know-Center GmbH, Austria‘s leading Research Center for Data-driven Business and Big Data Analytics. Know-Center is funded by the Austrian COMET program and 50% owned by Graz University of Technology. By undertaking applied science projects, Know-Center bridges the gap between science and industry.
Stefanie has an excellent track record in leading large interdisciplinary research projects and groups, in developing young scientists, and in obtaining EU and Austrian funding (more than €44.5 million over the last 10 years). She scientifically coordinated the large integrated projects MIRROR IP and APOSDLE IP. In addition, she is or was a key partner in many other EU-funded projects, such as LAYERS IP, MATURE IP, STELLAR NoE, OrganicLingua, etc. She has published more than 150 scientific papers in conferences and journals and supervised 18 Ph.D. theses. Stefanie is General Chair of the European Data Forum (EDF 2015) and of the I-KNOW conference series (since 2012), and was Program Chair of the European Conference on Technology Enhanced Learning (EC-TEL) 2012.
Full Professor and Chair of the Database Systems and Information Management Group at the Technische Universität Berlin
Abstract: Big Data Management and Scalable Data Science: Key Challenges and (Some) Solutions
Story: Europe needs more oil
The shortage of qualified data scientists is effectively preventing Big Data from fully realizing its potential to deliver insight and provide value for scientists, business analysts, and society as a whole. Data science draws on a broad range of advanced concepts from the mathematical, statistical, and computer sciences, in addition to requiring knowledge of an application domain. Teaching these diverse skills alone will not enable us to exploit, on a broad scale, the power of predictive and prescriptive models for huge, heterogeneous, and high-velocity data. Instead, we will have to simplify the tasks a data scientist needs to perform, bringing technology to the rescue: for example, by developing novel ways for the specification, automatic parallelization, optimization, and efficient execution of deep data analysis workflows. This will require us to integrate concepts from data management systems, scalable processing, and machine learning, in order to build widely usable and scalable data analysis systems. In this talk, I will present some of our research results towards this goal, including the Apache Flink open-source big data analytics system, concepts for the scalable processing of iterative data analysis programs, and ideas on enabling optimistic fault tolerance.
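The iterative data analysis programs mentioned in the abstract can be illustrated with a minimal sketch. The following pure-Python toy (illustrative only, not actual Apache Flink code) shows the kind of fixed-point iteration, here a simple PageRank, that systems like Flink parallelize and optimize automatically:

```python
# A minimal, pure-Python sketch of an iterative data analysis program:
# PageRank on a toy graph given as (src, dst) edge pairs. In a system like
# Apache Flink, an equivalent dataflow would be parallelized automatically.

def pagerank(edges, damping=0.85, iterations=20):
    """Iteratively compute PageRank scores until (approximate) convergence."""
    nodes = {n for e in edges for n in e}
    out_links = {n: [d for s, d in edges if s == n] for n in nodes}
    ranks = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        # Start each node with the teleport contribution...
        contribs = {n: (1.0 - damping) / len(nodes) for n in nodes}
        # ...then distribute each node's current rank along its out-links.
        for src, targets in out_links.items():
            if targets:
                share = damping * ranks[src] / len(targets)
                for dst in targets:
                    contribs[dst] += share
            else:  # dangling node: spread its rank over all nodes
                for n in nodes:
                    contribs[n] += damping * ranks[src] / len(nodes)
        ranks = contribs
    return ranks

edges = [("a", "b"), ("b", "c"), ("c", "a"), ("a", "c")]
ranks = pagerank(edges)
```

Each pass over the data depends on the previous one, which is exactly what makes such workloads harder to parallelize than one-shot map/reduce jobs, and why their efficient execution is a research topic in its own right.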
Volker Markl is a Full Professor and Chair of the Database Systems and Information Management Group at the Technische Universität Berlin (TUB) and an Adjunct Full Professor at the University of Toronto. He is Director of the Intelligent Analytics for Massive Data Research Group at DFKI and Director of the Berlin Big Data Center. In addition, he serves as the Secretary of the VLDB Endowment. His current research interests include new hardware architectures for information management, scalable processing and optimization of declarative data analysis programs, and scalable data science. To date, Volker has presented over 200 invited talks in numerous industrial settings, major conferences, and research institutions worldwide. Furthermore, he has authored and published over 100 research papers at world-class scientific venues. Between 2010-2016, he was Speaker and Principal Investigator of the Stratosphere Research Unit funded by the German Research Foundation (DFG), which resulted in numerous top-tier publications, as well as the Apache Flink big data analytics system. In 2014, he was named one of Germany’s leading digital minds (Digitale Köpfe) by the German Informatics Society. Prior to joining TUB, he was a Research Staff Member and Project Leader at the IBM Almaden Research Center in San Jose, California.
Scientific Director Telefonica
Abstract: Towards Human Behavior Modeling from (Big) Data
We live in a world of data, of big data, a large part of which has been generated by humans through their interactions with both the physical and digital world. A key element in the exponential growth of human behavioral data is the mobile phone. There are more mobile phones in the world than humans, which has made the mobile phone the piece of technology with the highest level of adoption in human history. We carry them with us all through the day (and night, in many cases), leaving digital traces of our physical interactions. Mobile phones have become sensors of human activity, both at large scale and as our most personal devices.
In my talk, I present some of the work we are doing at Telefonica Research on modeling humans from behavioral data collected from mobile phones. The data may be collected automatically by the mobile network infrastructure (in particular Call Detail Records, or CDRs) or collected by an experimental mobile application through a user study. All projects entail data analytics and machine learning in order to build accurate models of individual or aggregate behavior.
In particular, I describe four projects: (1) a project to automatically infer personality from Call Detail Records; (2) MobiScore, a project to automatically assess a credit score from Call Detail Records; (3) Borapp, a mobile app able to detect boredom from patterns of phone usage; and (4) a project to automatically predict crime hotspots in a city from human dynamics and demographic data.
I conclude by highlighting opportunities and challenges associated with building data-driven models of human behavior.
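As a rough illustration of how behavioral features might be derived from CDRs before any modeling takes place, here is a minimal Python sketch. The record schema and the chosen features are assumptions made for illustration only, not Telefonica’s actual pipeline:

```python
# A hedged sketch of per-user feature extraction from Call Detail Records.
# The fields (caller, callee, timestamp, duration) and the four features
# below are illustrative; real CDR-based models use far richer feature sets.

from datetime import datetime

def cdr_features(cdrs, user):
    """Aggregate simple behavioral features for one user from a list of CDRs."""
    calls = [c for c in cdrs if c["caller"] == user]
    if not calls:
        return {"n_calls": 0, "n_contacts": 0, "mean_duration": 0.0, "night_ratio": 0.0}
    contacts = {c["callee"] for c in calls}
    night = [c for c in calls if datetime.fromisoformat(c["timestamp"]).hour < 6]
    return {
        "n_calls": len(calls),                                        # activity level
        "n_contacts": len(contacts),                                  # social breadth
        "mean_duration": sum(c["duration"] for c in calls) / len(calls),
        "night_ratio": len(night) / len(calls),                       # nocturnal behavior
    }

cdrs = [
    {"caller": "u1", "callee": "u2", "timestamp": "2016-05-01T23:10:00", "duration": 120},
    {"caller": "u1", "callee": "u3", "timestamp": "2016-05-02T03:30:00", "duration": 60},
    {"caller": "u2", "callee": "u1", "timestamp": "2016-05-02T10:00:00", "duration": 30},
]
feats = cdr_features(cdrs, "u1")
```

Feature vectors of this kind would then feed a standard supervised model, e.g. to predict a personality trait or a credit-score label.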
Nuria Oliver is Scientific Director at Telefonica R&D. She graduated top of her class and top of Spain in Electrical Engineering and Computer Science from the ETSIT at the Universidad Politecnica of Madrid (UPM), Spain in 1994. She received her PhD degree from the Massachusetts Institute of Technology (MIT, 2000). From July 2000 until November 2007, she was a researcher at Microsoft Research in Redmond, WA. After 12 years in the US, she returned to Europe to co-create the Research organization at Telefonica R&D by creating and leading her internationally recognized industrial research team in Barcelona. She is the first female Scientific Director at Telefonica. She is a Senior Member of IEEE and honored to be the first Spanish female computer scientist named a Distinguished Scientist of the ACM.
Principal Data Scientist at #ZeroG
Abstract: How Big Data Is Re-Defining The Customer Experience In The Aviation Industry
The aviation industry is facing not only huge pressure on the cost side but also a profound disruption in marketing and service not experienced in previous decades. Andreas Ribbrock of #zeroG – A Company of Lufthansa Systems – will talk about how airlines use data-driven decision making to create a unique customer experience along the entire customer journey.
Andreas’ presentation will cover how airlines today drive the digital transformation of their business towards a truly customer-centric organization, focusing on increasing customer loyalty and brand equity. It will include aspects of how to move from traditional data handling along departmental silos towards treating data as a shared asset. His talk will cover how leading airlines use advanced analytical methods today, and which actions need to be taken from analytical insights along the customer journey to deliver personalized, relevant and useful information and services to each individual traveler, at the right time and on the right channel. Andreas will talk about the challenge of combining analytical insights, real-time context information, and business drivers to achieve the holistic decision making needed to create a unique customer experience.
Andreas holds a Ph.D. in computer science from Bonn University. His fields of study included signal processing and data structures and algorithms for content-based retrieval in non-relationally structured data such as images, audio signals and 3D protein molecules. After finishing his Ph.D., Andreas joined the analytics heavyweight Teradata, where he held multiple positions, including in solution architecture and big data architecture, delivering analytical projects for global players like DHL Express, Deutsche Post, Lufthansa, Otto Group, Siemens, Deutsche Telekom, and Volkswagen. In 2012 Andreas became team leader of Teradata’s data science practice in Germany, delivering leading-edge data science solutions in various industries.
In 2015 Andreas joined the Cologne-based IoT database startup ParStream as head of product management and data scientist, further developing ParStream’s analytical capabilities. After the acquisition of ParStream by Cisco, Andreas accepted the challenge of building a big data architecture and data science team for the Lufthansa Systems spin-off #ZeroG. Andreas has presented at international conferences on topics related to big data architectures, data science and data warehousing.
Director & Head of Design, Google
Big Data Visualization
Russ Wilson joined Google as Director & Head of Design in February 2016. He had joined Microsoft in December 2013 as Partner Director and Head of Design for Business Intelligence, a collection of products and services that enable you to visualize data, share discoveries, and collaborate in intuitive new ways. Prior to Microsoft, Russ was the founder and Director of IBM’s Mobile Innovation Lab, whose mission is to identify disruptive mobile technologies and trends and transform them into new products and solutions that deliver significant value to individuals and companies. Russ also served as Head of Design for IBM’s Application and Integration Middleware business unit, providing design leadership for WebSphere, MobileFirst, Pure, IoT, and SmarterProcess. As one of IBM’s new design executives, Russ helped teach and evangelize design thinking throughout IBM and influence a new era in software product development. Prior to IBM, Russ was the worldwide Senior Vice President of User Experience at CA Technologies, where he led the development of an enterprise-wide design language and a common application platform. Russ joined CA as part of the acquisition of NetQoS, where he was one of the original executives. Russ is a serial entrepreneur and an experienced executive, with proven success in both startups and large companies. His interests include approaches to innovation, user experience design, rapid prototyping, and team building.
Head of Media Technology, CIO, Managing Director Styria Media Group AG
Abstract: Content meets Technology – Digitalization in the Media Industry
The ongoing digitalization has led to a radical change in the media industry. Regionally-oriented companies like the Styria Media Group have been forced to rethink their business strategy to reposition themselves on the market.
Thomas Zapf provides an insight into these challenges and how the Styria Media Group is handling them successfully.
Thomas Zapf started his career as IT Manager for Tridonic Lighting Components in 1999 in Graz. He then worked as an SAP consultant on international projects for Siemens Business Services.
In 1999, Thomas Zapf moved to Magna Steyr in Graz as SAP Manager.
From 2004 to 2007, he worked as IT Director for Austrian Energy & Environment AG. Immediately afterwards, he returned to Magna Steyr as Head of IT Infrastructure & Operations, responsible for process and cost optimization.
In 2011, Thomas Zapf started as CIO at Sulzer AG in Switzerland.
Two years later, he returned to Austria and became CEO of Styria IT Solutions in the Styria Media Group. In 2015, Thomas Zapf also took on responsibility for all digital services. As CIO and Head of Media Technology, he is now responsible for all technological developments within the group.
Vice President Contract Manufacturing Magna Steyr
Abstract: Smart Factory @ Magna Steyr
The line between the digital and the real world is increasingly blurred. Only companies that constantly develop and evolve will remain competitive in the market. Companies in industrialized countries, with their high labor costs, in particular have no option but to improve their competitiveness in order to safeguard their foothold in the market and thus protect the long-term viability of their locations. Magna Steyr’s Smart Factory, with its real use cases, is a significant help in mastering this challenge.
This presentation addresses the megatrends in the automotive industry – the trends which allow us to anticipate the demands the production systems of the future will need to satisfy. This includes the essential requirement of building a “digital twin” to trace the interactions between the virtual and real worlds as a basis for the implementation of the Smart Factory.
In addition, the presentation will discuss the vision of an adaptable production system controlled by a network of intelligent objects and production segments featuring autonomous, self-organized and data-based production processes.
A large variety of application cases will be described in which Magna Steyr’s innovative Smart Factory solutions have been used successfully to improve competitiveness. A visionary outline of the production of the future wraps up the presentation.
Dr. Wolfgang Zitz studied Process Engineering at TU Graz and received his doctorate in 1992 (dissertation topic “Treatment of Wastewater from Collection Pits in Municipal Sewage Treatment Plants”). In 1992 he began his career at Magna Steyr (formerly Steyr Daimler Puch Vehicle Technology) as a technical assistant to the production manager for SUVs (now the Mercedes-Benz G-Class). From 1994, he held various positions in the paint shop and assumed more and more responsibility. In 2008 he was appointed General Manager of Paint and Body. In this role, Dr. Zitz was responsible for Magna Steyr’s paint and body technology worldwide and for the AMG project in Graz. In 2011 and 2012 he was responsible for strategy, efficiency and innovation across Magna Steyr’s production areas. In November 2012 Dr. Zitz was appointed Vice President of Contract Manufacturing and is responsible for all production areas of Magna Steyr in Graz (Austria) and Hambach (France).
SVP Digital Business innovation Infonova GmbH, Partner at BearingPoint
Abstract: Product and service bundling and how to monetize data – lessons learnt from the Telco industry
Leading companies across multiple sectors are now racing to incorporate Digital Ecosystem Management (DEM) into their business model. It is not about trying to become the next Apple or Uber, but simply about creating rapid new growth. Flexibility in bundling products and services, in pricing, and in combining business model patterns is key to generating quick value for clients.
A DEM system is a new breed of software with a long heritage in complex multi-party, multi-service monetization from the telecoms sector. It integrates easily with legacy systems and processes, so service providers, their partners and other third parties can start using it straight away.
The TeleManagement Forum (TM Forum), a non-profit industry association for service providers and their suppliers, originally defined standards for the telecommunications industry. Driving the next wave of digital business growth, the digitization of every industry, TM Forum has defined the new Frameworx, a suite of best practices and standards. Organizations across industries and ecosystems can now work together and agree on common approaches, supported by a common innovation platform that connects businesses, industries, and ecosystems, including the monetization of data-driven business.
Infonova R6 is a Digital Ecosystem Management platform with multi-tenant concept-to-cash Business Support Services (BSS) capabilities at its core. It is TM Forum Frameworx conformance certified and supports organisations by providing the business solutions needed for the creation, delivery and monetization of innovative digital services, including data-driven business and multi-party ecosystem management. This enables the monetization of end-customer business relationships and caters for sharing revenues and allocating costs with all service-providing partners on the platform.
Offering digital services involves working with complex scenarios in which multiple partners and suppliers collaborate in bi-directional, multi-level revenue chains across various B2B2x business models. Infonova R6 gives businesses the ability to manage partners, monetise relationships and offerings, and scale their services in the digital economy.
Gerhard started his career in the early 1980s as an assistant professor at the Institute for Information Processing and Computer Based Media at Graz University of Technology. He holds a Master’s degree in Technical Mathematics and Information and Data Processing.
Gerhard joined Infonova in 1998 as Director of Marketing and Sales and became Managing Director in 2004.
In 2009 he also became a partner at BearingPoint, a multinational management and technology consulting firm headquartered in Amsterdam, Netherlands. It has operations in 21 countries with around 3,800 employees and is one of the largest management consultancies in Europe, with global operations.
He has more than 30 years of experience in the communication, media and entertainment industry and has supported large Austrian and European telecommunications, cable and broadcasting companies in implementing CRM and billing systems. Recently he has focused on implementing new digital ecosystems based on Infonova’s R6 concept-to-cash platform for cross-industry digital service providers.
He is an international conference speaker and initiated the Digitaldialog in Styria, a monthly series of presentations focusing on new technologies and innovations in the rapidly changing digital world. He is a member of the board of the “Technology and Society Forum” of Graz University of Technology.
Abstract: Sensor-enabled Data Analytics for Smart Production
Data Analytics – gaining new insights from data – is developing into a key discipline in many industries. In smart production, data analytics is the driver for process optimisation, including quality control, the monitoring of machines and production lines, and predictive maintenance. In particular, the combined analysis of existing data with newly generated data from appropriate sensors, integrated in real-time computing environments, enables advanced connected applications in smart production processes. Several current deployments of such applications will be presented.
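As a minimal illustration of the kind of sensor analysis involved, the following Python sketch flags readings that drift far outside a rolling baseline, a simple form of condition monitoring used in predictive maintenance. The window size, threshold, and simulated signal are illustrative assumptions, not a description of any deployed system:

```python
# A minimal condition-monitoring sketch for predictive maintenance:
# flag sensor readings more than k standard deviations from a rolling
# baseline of the preceding `window` readings.

from statistics import mean, stdev

def detect_anomalies(readings, window=10, k=3.0):
    """Return the indices of readings far outside the rolling baseline."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            anomalies.append(i)
    return anomalies

# Simulated vibration sensor: a stable signal with one spike
# (e.g. the onset of a bearing fault) at index 12.
signal = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 10.2, 9.9, 10.0,
          10.1, 9.9, 25.0, 10.0, 10.1]
faults = detect_anomalies(signal)
```

Real deployments combine many such signals with historical maintenance data and richer models, but the principle of comparing live sensor streams against a learned baseline is the same.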
Harald Mayer is the head of the Intelligent Information Systems research group and deputy director of the Institute DIGITAL. He has strong expertise in web-based information systems and multimedia technologies. In 1990 he began as a research engineer and project manager at JR, Institute of Information Systems. He has more than 10 years of experience in project management and was also the responsible Administrative Coordinator of major European projects (APOSDLE, NEXT-TELL). He has also been working for the European Commission as an expert to evaluate proposals and review projects.
Birgit Kornberger completed her studies in Technical Mathematics at Graz University of Technology in December 2004 and joined the JOANNEUM RESEARCH Institute for Economic and Innovation Research, research group Statistical Applications, as junior researcher in January 2005. She obtained experience in the application of statistical methods and models in various application fields (environmental, food, industrial, etc.). Her main research interests are predictive analytics methods (generalized additive models, time series modelling, dimension reduction techniques, …), reliability modelling and prediction, as well as developing R programs and applications.
Abstract: Digital Transformation, Industry 4.0 and Smart Services – What really matters?
When it comes to Industry 4.0, it is not just about automation and increasing efficiency. It is a lot more, because the global competition for data is already in full swing. Companies that don’t adapt to the changes can quickly fail due to the “Darwinian principle” that has already hit some market players.
How can you benefit from the business opportunities that come with digitization? Industry 4.0 is about bringing physical and digital services together so that new intelligent offerings arise, based on digitally refined business models. Likewise, data obtained through methods such as predictive analytics can be used to learn more about future trends and developments.
We will present a best-practice example from the field of logistics that shows, in a simple procedure, how you can start your digital transformation successfully!
Mr Gebhard is Senior Account Manager at eurodata tec GmbH (a subsidiary of the eurodata group). He is a data specialist with many years of experience in Business Analytics, Data Management, Data Integration and Big Data, and his expertise has been called upon in numerous customer projects. He holds a diploma in Business Informatics. Before joining eurodata, he worked for companies including Mindjet and Cognos.
Thomas Kolomaznik has over 20 years of experience in the IT industry. After successfully completing an elite college for IT data processing, he worked for the next 12 years at Cognos, where he was responsible for building up Cognos Business Services and the partner network in Eastern Europe. In 2011 he founded comesio, where he became managing director in 2015 and also managed the merger with eurodata AG.
Head of AVL Integrated and Open Development Platform
Abstract: Into the data driven future with the connection of virtual and real worlds – How to get in the driver’s seat
The automotive industry has always been known to push existing boundaries and find innovative new concepts and solutions. Especially in recent times, the growing complexity in vehicle development demands new transformational approaches in order to bundle existing knowledge and capabilities. This knowledge, tremendously precious and complex, is stored in the minds of employees, in processes, in simulation models, in data sources, in requirement descriptions, in test methods and, of course, in the products and their components. It is of utmost importance to sustain existing expertise and simultaneously forge new paths in order to increase efficiency and agility. It is AVL’s conviction that this goal can only be reached via the integration of virtual and real worlds throughout the entire development process…
Abstract: Oracle Big Data Discovery for CERN’s Control Data
CERN’s particle accelerator infrastructure is highly heterogeneous. A number of critical subsystems, which represent cutting-edge technology in several engineering fields, need to be considered: cryogenics, power converters, magnet protection, etc. The historical monitoring and control data derived from these systems has been persisted mainly using Oracle database technologies, but also in other data formats such as JSON, XML and plain text files. All of these must be integrated and combined in order to provide a full picture of the overall status of the accelerator complex. Therefore, a key challenge is to facilitate easy access to, flexible interaction with, and dynamic visualization of heterogeneous data from different sources and domains.
In this session we will present several CERN use cases where Big Data technologies are being used, and show how Oracle Big Data Discovery enables an easy integration of different data sources, provides powerful analytical capabilities and unlocks hidden correlations and insights.
Antonio Romero joined CERN in 2011 to work for the Beams Controls group, providing solutions for the configuration needs of the Accelerator Controls Systems using database and Java technologies.
Currently, he works in the IT Database group, in the Scalable Analytics Solutions section, supporting the data analytics needs of the research and development activities of the CERN user community.
He is involved in multiple areas of the CERN Big Data and Analytics infrastructure, including big data visualization and discovery, ETL processing at scale, data management, the integration and deployment of distributed applications, and the use of machine learning for various CERN use cases such as improving the operation and exploitation of the CERN Accelerator Complex.