IT Consultant – Data Science and Analytics

Birmingham, West Midlands
Gross per annum: £41,000 – £43,000
Posted: 22 Sep 2016
Closing date: 20 Oct 2016
Contract type: Full time


Vichag is one of the UK's rapidly growing business consulting companies, specialising in consulting and business solutions for industry and public-sector enterprises across the UK and Europe.

We are currently looking for two experienced techno-functional IT consultants specialising in data science and analytics for complex enterprise application scenarios.

Key Responsibilities     

  • Import data into HDFS and Hive using Sqoop.
  • Apply expertise in quantitative analysis, data mining, and the presentation of data to see beyond the numbers and understand how our users interact with our core/business products
  • Partner with product and engineering teams to solve problems and identify trends and opportunities
  • Inform, influence, support, and execute company product decisions and product launches.
  • Work in Hadoop and Hive, primarily SQL.
  • Author pipelines using a SQL- and Python-based ETL framework.
  • Build key data-sets to empower operational and exploratory analysis
  • Automate analyses and set goals
  • Design and evaluate experiments monitoring key product metrics, undertake root cause analysis of changes in metrics
  • Build and analyse dashboards and reports
  • Understand ecosystems, user behaviors, and long-term trends
  • Communicate the state of the business and experiment results to product teams.
  • Spread best practices across analytics and product teams.
  • Develop Pig scripts and UDFs to pre-process the data for analysis.
  • Create Hive tables to store the processed results in tabular format.
  • Develop Sqoop scripts to enable interaction between Pig and the database.
  • Analyse raw data, draw conclusions, and develop recommendations, writing SQL scripts to manipulate data for loads and extracts.
  • Install, administer, and maintain SQL Server instances.
  • Take care of database design and implementation.
  • Create linked servers to SQL Server and other databases such as Oracle and Access.
  • Design database backup and restoration strategy.
  • Create indexes, views, complex stored procedures, user-defined functions, cursors, derived tables, common table expressions (CTEs), and triggers to facilitate efficient data manipulation and data consistency.
  • Extract, transform, and load (ETL) data between homogeneous and heterogeneous systems using SQL Server tools (SSIS, DTS, BULK INSERT, BCP, and XML).
  • Import/export data from sources such as Excel, Oracle, DB2, and flat files using the DTS utility; transform OLTP data into the data warehouse using DTS and SQL commands.
  • Configure report server and Report Manager scheduling and grant permissions to different levels of users in SQL Server Reporting Services (SSRS); generate on-demand and scheduled reports for business analysis and management decision-making.
  • Schedule and deploy reports, and upload files to a report server catalogue from Report Manager.
  • Stream the analysed data to existing relational databases using Sqoop, making it available to the BI team for visualisation and report generation.
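To give candidates a flavour of the Sqoop-based ingestion work described above, here is a minimal sketch of how a Python pipeline step might assemble a `sqoop import` command that lands a relational table in HDFS and registers it in Hive. The flags are standard Sqoop import options, but the connection URL, table, and target directory are hypothetical examples, not details of this role's actual environment:

```python
# Minimal sketch of the Sqoop-to-Hive ingestion step.
# All names (JDBC URL, table, warehouse path) are hypothetical examples.

def build_sqoop_import(jdbc_url, table, target_dir, num_mappers=4):
    """Assemble a `sqoop import` command that lands a relational
    table in HDFS and registers it as a Hive table."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--target-dir", target_dir,
        "--num-mappers", str(num_mappers),
        "--hive-import",               # also create/load a Hive table
        "--hive-table", table.lower(),
    ]

cmd = build_sqoop_import(
    jdbc_url="jdbc:mysql://db.example.com/sales",  # hypothetical source DB
    table="ORDERS",
    target_dir="/user/etl/orders",
)
print(" ".join(cmd))
```

In a real pipeline this command list would be handed to a scheduler (e.g. Oozie, as listed below) or a subprocess call rather than printed.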

Essential Experience

  • Experience in building Analytical applications.
  • Experience in statistical modelling and algorithm design using various machine learning techniques.
  • Experience in data distributions and transformations.
  • Experience in statistical summaries and visualisation.
  • Experience in probability theory.
  • Experience in hypothesis testing and significance.
  • Experience in model selection and goodness of fit.
  • Detailed understanding of, and hands-on experience with, Hadoop (HDFS, MapReduce) and its ecosystem tools.
  • Able to communicate with and understand the concerns of application, UI, and SQL developers and management, and to work towards a solution.
  • Experience implementing a data warehouse in the Hadoop ecosystem.
  • Full involvement in the requirements and analysis phase.
  • Experience working with MSBI tools such as SSIS, SSRS, and SSAS.
  • Experience coordinating with backend teams to gather log information for data analysis.
  • Experience maintaining design structure (schema) by directing the implementation of SQL standards and guidelines.
  • Experience testing stored procedures on a test server.
  • Proficiency in programming languages: R, Python, T-SQL, C#, ASP.NET, ADO.NET, core Java.
  • Proficiency in databases: MS SQL Server 2000/2005/2008/2012.
  • Proficiency in ETL tools: SSIS and DTS.
  • Proficiency in reporting tools: SQL Server Reporting Services (SSRS).
  • Proficiency in Hadoop technologies: HDFS, Pig, Hive, Sqoop, HBase, Mahout, and Oozie.
  • Experience migrating data from SQL Server 2005 to 2008, and from SQL Server 2008 to 2012.
  • Experience creating log shipping on servers for disaster recovery.
  • Experience executing database backup, integrity-check, index-rebuild, and statistics-update maintenance tasks.
  • Experience using Team Foundation Server (TFS) and Git for version control.
  • Documentation of reports and SSIS packages.
  • Experience handling job failures and probe alerts.
  • Experience with developer tools such as Git, Jenkins, and JIRA.
  • Ability to adapt to new technologies and to improve existing processes.

Desired Experience

  • Experience in database administration and development.
  • Knowledge of continuous integration.
  • Knowledge of C# and core Java.
  • Exposure to HTML, XML, and JavaScript.
  • Experience in managing data-driven frameworks.
  • Knowledge of regression and exploratory testing.
  • Experience of performance testing to meet QA standards.

Skill Sets Required

  • Ability to initiate and drive projects to completion with minimal guidance
  • The ability to communicate the results of analyses in a clear and effective manner
  • Basic understanding of statistical analysis.
  • Experience with large data sets and distributed computing (Hive/Hadoop) a plus.
  • Fluency in SQL or another programming language, with some development experience in at least one scripting language (PHP, Python, Perl, etc.).
  • Adaptable and keen to learn new data transformation tools and methodologies.
  • Previous experience working with current technologies such as Hadoop.
  • Experience using JIRA or similar web-based test/defect management tools.
  • Ability to trace, record, and resolve bugs during test execution.
  • Good experience with Microsoft Office 365 and Microsoft Windows operating systems.
  • Extensive background in IT business requirements gathering, data analysis, and data transformation with MSBI.
  • Experience producing business definition documentation and creating detailed design documents with flow charts.
  • Excellent communication (written and verbal) and client engagement skills.
  • Preparation of appropriate documentation of major database changes for future reference.
  • Readiness to deliver high-quality outputs to tight timelines.

Education Level / Qualifications

  • Bachelor’s degree in Computer Science
  • Master’s degree preferred

Travel: occasional, within the UK and abroad


Last date to apply: 20th October 2016