Head of Data Science – Big Data Architect
Telecommunication; Data Analysis
Kuala Lumpur, Malaysia
MYR 300,000 per annum
2nd October 2019
Client Background & Role Summary:
Our client has been listed on the Stock Exchange of Malaysia since 2001 and has been in operation for more than 20 years. They provide a full suite of domestic and international connectivity services, including broadband and data centre solutions, to the Enterprise, SME, Wholesale, and Consumer markets. They are looking for a Head of Data Science – Big Data Architect.
- Use predictive modelling, statistics, machine learning, data mining, and other data analysis techniques to collect, explore, and extract insights from structured and unstructured data.
- Develop algorithms and applications that apply mathematics to data, perform large-scale experimentation, and work with developers to build data-driven applications that translate data into intelligence, solve a variety of business problems, and enable business strategy.
- Possess a strong understanding of internal business segments (stakeholders) and strong written and verbal communication skills. Typically requires expertise in relational database structures, research methods, machine learning, cloud-based technologies, Big Data technologies (e.g. Hadoop), analytics packages (e.g. Tableau), scripting languages (e.g. Python, Perl), and programming languages (e.g. Java, C/C++, SQL).
- Possess a passion for advanced analytics and data science and a keen desire to solve business problems and find patterns/insights within structured and unstructured data. Demonstrated technical expertise in descriptive, diagnostic, predictive, and prescriptive analytics.
- Bachelor's or Master's degree in Computer Science or Software Engineering.
- Experience in a multinational company, in a managerial role within a Big Data division, is a must.
- Strong knowledge of and experience with statistics, and potentially other advanced mathematics.
- Deep knowledge of data mining, machine learning, natural language processing, or information retrieval.
- Experience processing large amounts of structured and unstructured data. MapReduce experience is a plus.
- Sufficient programming knowledge to clean and scrub noisy datasets.
- Knowledge of Apache Spark/Hadoop.
- Intermediate knowledge of SAP data structures.
- Understanding of database normalization in structured and unstructured databases.
- Knowledge of relational databases such as Oracle, MySQL, MSSQL, or DB2.
- Knowledge of NoSQL databases such as MongoDB.
- Deep understanding of object-oriented programming is a plus.
- Programming experience, ideally in PHP or Java.
If you are interested, kindly email your updated resume to Rachel at firstname.lastname@example.org or click “Apply Now”. Regretfully, only shortlisted candidates will be contacted. Thank you!