23 July 2015

Telecom Support Company Jobs - Hadoop Developer ~ Ugandan Jobline Jobs







Organisation: Telecom Support Company

Duty Station: Kampala, Uganda


NFT Consult, a leading HR consultancy firm, seeks to recruit for its valued client, a company in the telecommunications and big data space responsible for delivering scalable and reliable systems and frameworks that cut across the product development organization. From streaming billions of real-time events to processing multi-terabyte datasets, the goal is to provide effective solutions to challenging problems.


Job Summary: The Hadoop Developer will be in charge of designing and building applications using procedural languages, most recently in the Hadoop space. This person must be comfortable explaining design concepts to customers.


Key Duties and Responsibilities: 

  • In charge of creating scalable and reliable systems for managing large amounts of data
  • Technically support other teams and developers who use our technology stack
  • In charge of developing against and extending Hadoop, Kafka, and other open source components
  • Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers’ requirements
  • Work closely with prospective customers’ technical resources to devise and recommend solutions based on the understood requirements
  • Keenly analyze complex distributed production deployments, and make recommendations to optimize performance
  • Quickly learn and adapt in a demanding and rapidly changing environment
  • Work closely with Hortonworks’ teams at all levels to ensure rapid response to customer questions and project blockers


Qualifications, Skills and Experience: 

  • The ideal candidate for the Hadoop Developer vacancy should hold both a BS and an MS in Computer Science or equivalent
  • Extensive Java programming experience
  • Prior experience developing for UNIX/Linux
  • Ability to build large-scale, high-performance Hadoop systems
  • Strong experience implementing software and/or solutions in an enterprise Linux or Unix environment
  • Prior experience with the Cask Data Application Platform (CDAP)
  • Strong foundation in data structures, algorithms, and software design
  • Past exposure and experience in large systems design and development
  • Experience with MapReduce/Hadoop or other distributed systems
  • Additional experience with Python and Scala
  • Practical experience with Hive, HBase, and HCatalog


All suitably qualified candidates should visit the web link below and create a profile on the NFT Consult website by entering their e-mail addresses. Please visit the web link below and click Apply Now if convinced you meet the job requirements. Only applications/CVs prepared in English and submitted in either MS Word or PDF format will be considered.
