Group sparse optimisation via lp,q regularisation

Dr Kaiwen Meng, associate professor at the School of Economics and Management, Southwest Jiaotong University

Synopsis

In this paper, we investigate a group sparse optimisation problem via lp,q regularisation in three aspects: theory, algorithms and applications. In the theoretical aspect, by introducing a notion of group restricted eigenvalue condition, we establish an oracle property and a global recovery bound for any point in a level set of the lp,q regularisation problem; by virtue of modern variational analysis techniques, we also provide a local analysis of the recovery bound for a path of local minima. In the algorithmic aspect, we apply the well-known proximal gradient method to solve lp,q regularisation problems, either by analytically solving some specific lp,q regularisation subproblems, or by using the Newton method to solve general lp,q regularisation subproblems. In particular, we establish a local linear convergence rate of the proximal gradient method for solving the l1,q regularisation problem under some mild conditions, by first proving a second-order growth condition. As a consequence, we obtain the local linear convergence rate of the proximal gradient method for solving the usual lq regularisation problem (0<q<1). Finally, in the application aspect, we present numerical results on both simulated data and real data in gene transcriptional regulation.
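As a concrete illustration of the algorithmic idea, the sketch below runs the proximal gradient (forward-backward) iteration on the q = 1 special case, where the proximal subproblem has the familiar closed-form soft-thresholding solution. The synthetic data, step size and regularisation weight are illustrative assumptions only, not values taken from the paper.

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (closed form for q = 1)
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def proximal_gradient(A, b, lam, step, iters=2000):
    # Minimise 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient
    # step on the smooth part with the proximal (shrinkage) step.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0                              # sparse ground truth
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz constant
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
print(np.count_nonzero(np.abs(x_hat) > 1e-3))
```

For q < 1 the proximal subproblem is non-convex, which is where the analytic solutions and Newton steps discussed in the talk come in.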

Biography

Dr Kaiwen Meng is an Associate Professor at the School of Economics and Management, Southwest Jiaotong University. He works on finite-dimensional optimisation problems, focussing on decisions with multiple objectives; penalty function theory, methods and applications; generalised polyhedra theory and applications; and portfolio selection theory. Kaiwen received his PhD in applied mathematics from the Hong Kong Polytechnic University in 2011. He has published around 15 papers in international journals, including SIAM Journal on Optimization, Operations Research, and Journal of Machine Learning Research.

Distributed Deep Learning At Scale on Apache Spark with BigDL

Jason Dai, senior principal engineer and CTO, Big Data Technologies, Intel

Synopsis

BigDL is an open-source distributed deep learning library, natively integrated with Apache Spark, that provides rich deep learning functionality for Spark. It combines the benefits of high-performance computing and Big Data architecture, so as to provide orders-of-magnitude speedup over existing out-of-the-box open-source DL frameworks (e.g. Caffe/Torch/TensorFlow) on single-node Xeon, together with efficient scale-out based on the Spark architecture; in addition, it allows data scientists to perform distributed deep learning analysis on big data using familiar tools, including Python, notebooks, etc.

In this talk, we’ll show how our users can easily build Deep Learning and AI-powered Big Data analytics (e.g. image, speech, NLP, time series, etc.) using BigDL, together with other libraries on Apache Spark (e.g. Spark SQL and DataFrames, Spark ML pipelines, Spark Streaming, etc.); this allows users to treat their Big Data (Hadoop/Spark) cluster as a unified data analytics platform for data storage, data processing and mining, feature engineering, traditional machine learning, and deep learning applications.

Biography

Jason is currently a senior principal engineer and CTO, Big Data Technologies, at Intel, responsible for research and development on advanced Big Data analytics (including distributed machine/deep learning), as well as collaborations with leading research labs (e.g. UC Berkeley AMPLab), with global engineering teams located in both Silicon Valley and Shanghai. He is an internationally recognised expert on big data, cloud and distributed machine learning; he is a committer and PMC member of the Apache Spark project, programme co-chair of the Strata Data Conference Beijing, and the creator of the BigDL project, a distributed deep learning framework on Apache Spark.

Dr Hongjuan Liu, director, insight and analytics, Aegon Insights Limited

Synopsis

The insurance industry was among the early adopters of data and analytics (e.g. actuarial science), yet it has been relatively slow in adopting Big Data compared to other industries, especially e-commerce. In this talk, Dr Liu shares his observations on how the insurance industry has evolved with the fast development of Big Data and analytics technologies, then presents some of the latest developments in Aegon Insights’ analytics practice across Asia Pacific. Further research and development opportunities are also discussed.

Biography

Dr Liu is currently Regional Director of Insight and Analytics for Aegon Insights Asia Pacific, a subsidiary of the Aegon group focusing on insurance distribution and analytics. Dr Liu is an analytics professional with both hands-on and leadership experience in leveraging data analytics for insurance pricing, sales, and marketing across various distribution channels, including digital distribution, and across diverse population segments. Prior to Aegon, Dr Liu held various technical and leadership positions in global insurance players such as Allianz UK, Prudential Corporation Asia and Cigna International.

Dr Liu earned his bachelor’s degree in Mathematics and Applied Mathematics from South China Normal University, and he also holds a Master’s degree in Operational Research and a PhD in Management Science from Lancaster University in the UK.

Big Data for financial analysis

Dr Victor Chang, Xi’an Jiaotong-Liverpool University

Synopsis

We are in an era that needs to collect and process large amounts of information quickly, accurately and precisely, and to send the refined outputs to different types of decision-makers and stakeholders. Big Data can play an influential role in large-scale data processing, interpretation, analytics and forecasting. In this invited talk, selected examples and illustrations from different disciplines, such as healthcare, finance, social networks, weather studies, security and cloud computing/disaster recovery, are discussed. All these examples are based on leading published journal articles led by the keynote speaker. Recent work on financial big data analysis based on an improved Heston model, Monte Carlo simulation, Black-Scholes simulation and Organisational Sustainability Modelling will be demonstrated, showing that financial stock analysis, business performance visualisation, trading and forecasting can be balanced and achieved: not only can thousands of data items be processed in seconds, but accuracy of 95 per cent and above can be achieved for analysis and forecasting. The latest developments in financial big data processing, analysis, prediction and visualisation will be presented in detail. Research contributions in financial big data and big data in other disciplines will be explained thoroughly, including further analysis, outcomes and positive impacts, to demonstrate that Big Data solutions can improve the quality of services and outputs.
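To give a flavour of the kind of financial computation mentioned above, here is a minimal, illustrative Monte Carlo pricing of a European call under geometric Brownian motion, checked against the Black-Scholes closed form. All parameter values are hypothetical and not taken from the speaker’s work.

```python
import numpy as np
from math import log, sqrt, exp, erf

def bs_call(S0, K, r, sigma, T):
    # Black-Scholes closed-form price of a European call option
    d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    N = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))  # standard normal CDF
    return S0 * N(d1) - K * exp(-r * T) * N(d2)

def mc_call(S0, K, r, sigma, T, n_paths=200_000, seed=0):
    # Monte Carlo price: simulate terminal prices under risk-neutral GBM
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    payoff = np.maximum(ST - K, 0.0)
    return exp(-r * T) * payoff.mean()

analytic = bs_call(100, 100, 0.05, 0.2, 1.0)
simulated = mc_call(100, 100, 0.05, 0.2, 1.0)
print(round(analytic, 3), round(simulated, 3))
```

With 200,000 paths the two prices agree to within a few cents, which is the sense in which "thousands of data items in seconds" and high accuracy can coexist.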

Biography

Dr Victor Chang is an Associate Professor in Information Management and Information Systems at International Business School Suzhou (IBSS), Xi'an Jiaotong-Liverpool University, China. He is Director of the PhD Programme and winner of the 2016 European Identity and Cloud Award for "Best Project in Research". Victor Chang was a Senior Lecturer in the School of Computing, Creative Technologies at Leeds Beckett University, UK and a Visiting Researcher at the University of Southampton, UK. He is an expert on Cloud Computing and Big Data in both academia and industry, with extensive experience in related areas since 1998. He completed a PGCert (Higher Education) and PhD (Computer Science) within four years while working full-time. He has over 100 peer-reviewed published papers. He won £20,000 in funding in 2001 and £81,000 in funding in 2009. He was involved in part of a £6.5 million project in 2004, part of a £5.6 million project in 2006, part of a £300,000 project in 2013 and part of a £50,000 project in 2016. He won a 2011 European Identity Award in Cloud Migration and a 2016 award. He was selected to present his research in the House of Commons in 2011 and won best paper awards in 2012 and 2015. He has demonstrated ten different services in Cloud Computing and Big Data in both his practitioner and academic experience. His proposed frameworks have been adopted by several organisations. He is the founding chair of international workshops on Emerging Software as a Service and Analytics and on Enterprise Security. He is a joint Editor-in-Chief (EIC) of the International Journal of Organizational and Collective Intelligence and a founding EIC of the Open Journal of Big Data. He is an Editor of a highly prestigious journal, Future Generation Computer Systems (FGCS). His security paper is the most popular paper in IEEE Transactions on Services Computing, and his FGCS paper has one of the fastest citation rates.
He is a reviewer for numerous well-known journals and has published three books on Cloud Computing, which are available on Amazon. He was a keynote speaker for CLOSER 2015/WEBIST 2015/ICTforAgeingWell 2015 and has received positive feedback. He has given keynotes at Math-CEng 2016 Indonesia, SPNCE 2016 Guangzhou China, ICCMIT 2017 Chennai India, ICRTCCM 2017 India, ICEIS 2017 Porto Portugal, ICEMIS 2017 Tunisia, DeSE 2017 Paris France and GI 2017 Fuzhou China. He is the founding chair of the IoTBDS and COMPLEXIS conferences. He won a 2017 Outstanding Young Scientist Award.

Big Data Analytics: Another buzzword within an old concept - but why?

Dr Gangmin Li, Xi'an Jiaotong-Liverpool University

Synopsis

Data analysis is nothing new. The use of data started on the very first day data was invented and recorded, and data analysis has been an important part of data usage ever since. To some extent, the history of human civilisation is the history of recording, processing and using data. But why is big data so important and useful today? Why big data, and why now? In what aspects have its innovation and revolution emerged and been reflected? This presentation tries to answer these questions from the speaker’s personal point of view, built on many years of teaching Big Data Analytics and researching Big Data. It covers:

  • Basic concepts of Big Data (BD) and Big Data Analytics (BDA)
  • Big Data Thinking (data-enabled organisation, data-driven decision making culture)
  • Big Data-enabled analysis methods (explorative, descriptive, predictive and prescriptive analysis)
  • Platforms and tools that are capable of processing Big Data
  • Big Data friendly ecosystems (share and protect)

Biography

Dr Gangmin Li is a highly educated academic with over 30 years of research and teaching experience at four British universities and one Chinese university. His research interests include Big Data Analytics, Knowledge engineering, Agent and multi-agent systems, Distributed Systems, Grid Computing and HCI. He has worked on many British EPSRC research projects, including ADAM, CORK, D3E, ScholOnto, Cohere, SocialLearn, OMII, MyGrid, GridSAM, and Grimories. He has produced a number of registered open-source software packages. He has over 40 publications and a new book on BDA, titled Big Data Analytics: concepts, platforms, algorithms and social, due to be published in autumn 2017.

Towards Hybrid Evolutionary and Swarm Techniques for Big Data Analytics

Dr Kevin Kam Fung Yuen, Xi'an Jiaotong-Liverpool University

Synopsis

Big data analytics applies modern statistical and machine learning techniques to analyse huge amounts of data, with challenging issues that particularly include the high dimensionality and multiple objectives of data analytics problems. With powerful search capabilities for optimisation, Evolutionary and Swarm Algorithms (ESA) have the potential to address these challenges in big data analytics today. Combining ESA with other conventional and recent statistical and machine learning techniques, the development of hybrid ESA techniques for Big Data Analytics is a fast-growing and promising multidisciplinary research area. This talk discusses recent developments in hybrid evolutionary and swarm techniques for solving specific challenges of big data analytics.
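To make the ESA idea concrete, below is a minimal particle swarm optimisation sketch minimising a toy objective. The inertia and attraction coefficients are standard illustrative choices, not specific to the hybrid techniques discussed in the talk.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, seed=0):
    # Minimal particle swarm optimisation: each particle remembers its own
    # best position, and the swarm shares a global best that attracts all.
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))      # positions
    v = np.zeros_like(x)                            # velocities
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                       # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

# Minimise the 5-dimensional sphere function as a toy example
best, best_val = pso(lambda p: float(np.sum(p**2)), dim=5)
print(best_val)
```

Hybrid techniques would replace the toy objective with, say, a clustering or feature-selection loss evaluated on big data, which is where the scalability challenges arise.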

Biography

Kevin Kam Fung Yuen received his PhD in Computational Intelligence and Operations Research, as well as his BSc (Hons) in Enterprise Engineering and E-business, from the Hong Kong Polytechnic University. He is currently an Associate Professor at the Department of Computer Science and Software Engineering, and the Research Institute of Big Data Analytics, Xi’an Jiaotong-Liverpool University, China. He is also a Visiting Associate Professor at National Taiwan University of Science and Technology, Taiwan. Previously, he was an Assistant Professor at Zirve University, Turkey. He has published more than 65 papers in journals, book chapters and conference proceedings. He currently holds two research grants: one from the National Natural Science Foundation of China (61503306) on “Fuzzy Cognitive Swarm Optimized Clustering Methods”, and one from the Natural Science Foundation of Jiangsu Province (BK20150377) on “Cognitive Memetic Self-Organizing Clustering Algorithm for Data Analytics”. He is currently serving as the leading guest editor for the Big Data Research journal special issue on “Hybrid Evolutionary and Swarm Techniques for Big Data Analytics and Applications”. He has served as co-guest editor for the Neurocomputing journal special issue on “Computational Intelligence Techniques for New Product Development”, and the Engineering Applications of Artificial Intelligence journal special issue on “Artificial Intelligence for Product Engineering”. He serves as a reviewer for various journals and conferences, including about 40 SCI/SSCI journals. He has served as a technical programme committee member for more than 40 international conferences.

Machine Learning Applications In Financial Time Series Forecasting

Dr Qi Deng, Xi’an Jiaotong-Liverpool University

Synopsis

Dr Deng will first discuss the application of traditional econometrics tools (such as CAPM-based factor models, ARMA, GARCH/EGARCH, and their multivariate varieties, such as VECM-VAR and DCC/ADCC) in stock forecasting and in portfolio construction and rebalancing. He will then demonstrate how these traditional tools can be enhanced by their counterparts in the machine-learning domain. He will present deep neural network (DNN), recurrent neural network (RNN) and long short-term memory (LSTM) applications in univariate (“single stock”) time series forecasting, as well as convolutional neural network (CNN), convolutional RNN (C-RNN) and convolutional LSTM (C-LSTM) applications in multivariate (portfolio) time series forecasting.
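For readers unfamiliar with the econometric side, a minimal GARCH(1,1) return simulation (one of the traditional tools named above) can be sketched as follows; the parameter values are illustrative only, not calibrated to any market.

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n, seed=0):
    # GARCH(1,1): sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2,
    # so today's volatility depends on yesterday's shock and volatility.
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha - beta)   # unconditional variance
    r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

# alpha + beta close to 1 gives the persistent volatility clustering
# typical of daily equity returns
r, sigma2 = simulate_garch11(omega=1e-5, alpha=0.08, beta=0.9, n=5000)
print(r.var())
```

The machine-learning counterparts in the talk (RNN/LSTM) learn this kind of temporal dependence from data rather than assuming a fixed recursion.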

Biography

Dr Qi Deng has been an associate professor in finance at IBSS-XJTLU since May 2014. He co-founded Altobeam (高拓讯达), an award-winning technology company, in 2007, and led its financial and business development activities until September 2016. In October 2016, Dr Deng founded Cofintelligence Technology Ltd. (上海浑度), a financial technology firm that specialises in machine learning-based quantitative modelling and algorithmic trading, and serves as its chief scientist. Dr Deng also provides consulting and training services to the finance industry. He is a senior quantitative advisor for several multi-million-dollar quantitative hedge funds, and an educational consultant for Great Wisdom (China’s largest financial data provider). He is also an adjunct professor at Grenoble Ecole de Management, a leading French business school with triple-crown accreditation. His research interests are machine learning-based quantitative modelling, trading and hedging with a variety of securities. Dr Deng earned a BS in Physics from Peking University, an MBA from the University of Michigan, and a DBA from Grenoble Ecole de Management. He holds a Certificate in Quantitative Finance (CQF).