eSourcing - Big Data Engineer
 
 

Walmart is in an exciting phase of growth, where we are trying to transform the business by enabling multi-channel retailing and driving optimal performance through the integration of merchandise management processes into a single enterprise-wide information system.

We're forming a new team to create an automated merchandising and eSourcing platform: a single, integrated, centralized solution for retail requirements from merchandising through store operations, point-of-sale and workforce management. This will quickly enable retailers to drive sales, margins and cash flow by getting the right products to the right place at the right time.

We are a small, agile team of highly motivated and smart people innovating to make the search and shopping experience across Walmart's online, mobile and physical stores more visual and intuitive. We are building a great online product powered by unique big data mining technologies, which enable discovery using algorithms that rank results via social signals from around the web.
Walmart processes billions of queries for millions of products on Walmart sites and apps worldwide. We mine structured and semi-structured data from product catalogs, the social web, transactions, query logs, etc. at an unprecedented scale. We work on big data problems and cutting-edge relevance algorithms from information retrieval, machine learning, and ranking to deliver a high-availability, low-latency service that directly impacts business metrics.

Big Data/Generalist Engineer
The Opportunity:
● help invent the next generation of ecommerce: integrated experiences that leverage the store, the web and mobile, with social identity as the glue
● work with world-class technologists and product visionaries as a contributing member of the @WalmartLabs core engineering team
● work with Ruby, Ruby on Rails, Java and other related technologies to design and develop high-performance, scalable applications for use within the @WalmartLabs product ecosystem
● help the team leverage and contribute to open source software whenever possible (Cassandra, Gearman, Hadoop, etc.)
● build, release, automate and configure developer-ready code and systems into production environments
● design, develop, operate and administer multiple production systems and tools
● work with Hadoop/MapReduce, Hive, HBase, Pig, Oozie and large-scale analytics systems capable of ingesting, managing, storing and analyzing hundreds of terabytes of data
● lay the foundation for the platform and propose solutions that ease software development, monitoring, and operations
● be excited about making an immediate impact on a global scale
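To give a concrete sense of the MapReduce model named above, here is a minimal word-count sketch in plain Java (no Hadoop dependency; the class and method names are illustrative only, not part of any Walmart codebase):

```java
import java.util.*;
import java.util.stream.*;

public class WordCountSketch {
    // "Map" phase: emit (word, 1) pairs for each word in an input line.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\W+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // "Reduce" phase: group the pairs by word and sum the counts.
    static Map<String, Integer> reduce(Stream<Map.Entry<String, Integer>> pairs) {
        return pairs.collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> input = List.of("the right products", "the right place");
        Map<String, Integer> counts =
                reduce(input.stream().flatMap(WordCountSketch::map));
        System.out.println(counts.get("right")); // 2
    }
}
```

In a real Hadoop job the map and reduce phases run in parallel across a cluster, with the framework handling the shuffle between them; the structure of the two functions is the same idea at scale.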

Responsibilities:

- Develop various facets of the Hadoop EDW ecosystem: real-time data processing, MapReduce jobs, and web services

- Define development standards and design patterns for Hadoop-based EDW development

- Prototype creative solutions quickly

- Provide timely and appropriate communication to business owners, stakeholders, and users on issue status and resolution

- Drive roadmap; manage priorities, projects, resources, issues and risks effectively

Qualifications:

- Passion for Big Data

- Strong core server-side Java programming, shell scripting, and Python

- Strong development skills with Hadoop, Hive, Pig, HBase, Mahout/R, MapReduce and web services

- Knowledge of and proven experience with object-oriented design, SOA, distributed computing, performance/scalability tuning, advanced data structures and algorithms, real-time analytics and large-scale data processing

- Experience working in an Agile/Scrum model

- Strong background in data warehousing principles, architecture and implementation in large multi-terabyte environments – a big plus

Core Competencies:

- Strong customer service skills and excellent verbal and written communication skills

- Excellent problem solving and analytical skills

- Ability and desire to work in a fast-paced environment and stay motivated and flexible

- Ability to work cross functionally to resolve technical, procedural, and operational issues