Big Data Architect
- Recruiter
- Vichag LTD
- Location
- B28 9HH, Birmingham
- Salary
- £36,000 to £42,000 Per Annum
- Posted
- 12 Apr 2017
- Closes
- 10 May 2017
- Ref
- BDA/CBLL/120417
- Sectors
- IT, IT Support, Programming, Software Development
- Contract Type
- Permanent
- Hours
- Full Time
Introduction
CBL Limited provides IT solutions, analytics, data management services and web solutions to support critical decision-making in Healthcare, Finance and Insurance sectors.
The company philosophy is to deliver high-quality, robust systems to tight timelines, with success measured by safe, sustained and high-quality delivery. We believe in working in a congenial and proactive environment where constant knowledge sharing provides opportunities for significant learning and skill-set development.
We are currently looking for a Big Data Architect for our UK operations.
Key Responsibilities
As a Big Data Architect within the team, you will be given an understanding of the company’s IT products and services. You will be part of an energetic team continuously engaged in improving our stack of offerings, with a focus on maintaining and enhancing our suite of Business Intelligence (BI) products and programming.
You will be responsible for the following:
- Design, migrate and implement Big Data infrastructure.
- Provide scalable, secure, reliable and durable solutions.
- Translate requirements into build activities.
- Shape solutions with key stakeholders and colleagues.
- Validate the reference architecture and provide details for the architecture of platforms, leading and working hands-on towards implementation and delivery to production for Hadoop and Kafka platforms.
- Collaborate with other tech leads to ensure integrated end-to-end design.
- Define and document technical approach and design.
- Provide technical expertise.
- Control quality throughout the project life cycle.
- Ensure best practice, leading by example.
- Support system and UAT testing.
- Control and monitor production implementations.
- Document solutions fully to support on-going development and service.
- Learn new technologies to support systems delivery.
Skill Set Required
- Cloud and programming environments – You will have worked on design, development and migration projects in multiple paradigms and have demonstrable experience of complex cloud-based infrastructure solutions and implementations. You will have demonstrable experience as a Big Data architect or cloud developer, with working knowledge of Bash scripting and Python.
- Appreciation of data – You will have worked with meaningful data previously in a mission-critical environment and be familiar with the techniques employed to ensure that this content is always of high quality.
- Working environment – You will be comfortable attacking complex challenges in many ways, as a team member.
- Quality and security – You will understand, respect and reinforce the importance of quality processes in the product development process. You will have demonstrable experience of using your initiative to improve the quality, robustness and accuracy of deliverables.
- Interpersonal skills – You should have demonstrable communication skills, strong problem-solving ability and a “can-do” attitude.
Essential Experience
- Hands-on experience with industry Big Data frameworks in the Hadoop stack (e.g. Hortonworks, Cloudera, Kafka, Spark, Storm, Sqoop, Pig, Hive, Flume)
- Comprehensive experience designing and implementing solutions built upon tools within the Hadoop ecosystem
- Experience as a Big Data Architect or Big Data Developer
- Programming experience in Java is preferred
- Experience in scripting languages such as Python, Ruby or Scala will be an advantage
- Proven experience of Databases - Oracle, MySQL, Teradata
- Strong understanding and knowledge of NoSQL databases - MongoDB, Cassandra
- Automation of deployment, customization, upgrades and monitoring through DevOps tools.
- Knowledge of and experience with cloud environments - AWS or Azure
- A strong understanding of development lifecycles using waterfall and agile methodologies
- An analytical and systematic approach to problem solving
- Excellent oral and written communication skills
Desired Experience
- Unix or Linux Knowledge
- XML/YAML/JSON
- Low level programming languages
- Training Skills
- Certification in Hadoop frameworks, Hortonworks, Cloudera, Cassandra
Education Qualifications
- Minimum – Bachelor’s Degree in Engineering.
- Preferred – Master’s Degree.
Travel - Occasional, within the UK and abroad
Last date to apply: 10th May 2017