Advanced Analytics Lead - Asset Management

  • Competitive
  • Chicago, IL, USA
  • Permanent, Full time
  • UBS
  • 07 Sep 2018

Your role

Are you a data- and insights-driven leader? Have you led the development of advanced analytics capabilities within an organization? Are you passionate about leveraging ML and AI capabilities for investment research? Are you interested in becoming part of a data and analytics team that is revolutionizing the investment management business?

We’re looking for someone to:

  • Play a leadership role in the formation of the team and platform to develop ML/AI and Quant capabilities for Investment Management
  • Lead the discovery and engagement with research and investment professionals to identify differentiating opportunities for application of advanced analytics
  • Provide thought leadership in the architecture of the data platform and processes to support advanced analytics
  • Work with Data Scientists to design testable and repeatable data collection and preparation steps during data model construction
  • Mentor the team in developing knowledge of the investments industry and the application of ML/AI techniques in the field of investment research
  • Work with Data Scientists to deploy models to Test and Production environments
  • Take responsibility for the overall governance and execution of planned projects and solutions
  • Ensure data quality and governance by teaming up with Data Governance team
  • Work with IT to oversee the implementation and integrity of the data platform

Your team
 
The Chief Data Office (CDO) within UBS Global Asset Management (AM) was established to redefine AM's data & analytics strategy and to lead the innovation agenda for the division. The CDO provides the foundation for establishing a data- and analytics-driven culture by guiding the successful delivery of both foundational and innovative solutions in support of AM's strategic agenda. To design multi-faceted, innovative solutions, the CDO is structured to encompass several functions, including Program Management, Data Analytics, Data Solutions, and Data Governance.
 
Your experience and skills
 

You have:

  • an advanced degree plus 12+ years of experience in data and analytics, with at least 3 years' experience applying ML/AI capabilities within the industry
  • strong knowledge of machine learning models with proven experience in producing predictive and prescriptive insights
  • strong working knowledge of investments or capital markets data – securities, trades, positions, valuations, risk and analytics
  • knowledge of AI/Deep Learning techniques, especially applying them to time series data through recurrent neural nets and to unstructured financial information sources
  • proven experience developing data pipelines, platforms, models and governance processes to deliver end-to-end solutions
  • a hands-on, solution-driven attitude, with the ability to turn around quick insights while delivering strategic capabilities
  • a creative mindset for crafting compelling visualizations of insights
  • practical experience with different big-data platforms and software solutions
  • the ability to multi-task and participate in multiple complex programs at the same time in an agile environment
  • experience implementing the controls required for data governance, quality and privacy to develop compliant solutions

You are:

  • an experienced leader in forming high-caliber, diverse teams and delivering analytical solutions from conceptualization to realization
  • proficient in machine learning models, model tuning, and methods for handling both limited data sets and high-volume time series data sets
  • proficient in SQL and data analytics languages like Python or R
  • knowledgeable about NoSQL data stores like Cassandra, Redis, and CouchDB, as well as relational DBs
  • experienced with graph databases like ArangoDB, DataStax Enterprise Graph, JanusGraph, or Neo4j
  • familiar with file systems like HDFS, AWS S3, or Azure Blob Storage
  • familiar with messaging and data transfer systems like Kafka, RabbitMQ, or Sqoop
  • familiar with batch processing using Spark, Flink, Hadoop MapReduce, or Hive