Category: Career

Front-End Developer

Responsibilities

  • Interface with internal business partners and program and product managers to understand requirements and deliver vendor integrations.
  • Add and maintain vendor library integrations for our site properties.
  • Manage and maintain privacy components for our site properties.
  • Adopt and define the standards and best practices in vendor integration/tag management including data integrity, validation, reliability, and documentation.
  • Troubleshoot issues related to third-party vendor integrations.
  • Analyze and solve problems at their root, stepping back to understand the broader context.
  • Learn and understand GoDaddy’s revenue tracking platform, and know when, how, and which parts of it to use and which not to.
  • Keep up to date with advances in technologies and take pilots through design reviews.
  • Continually improve ongoing processes, automating or simplifying self-service support where possible.
  • Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.

Qualifications

  • Strong JavaScript experience required.
  • Solid Node.js experience required.
  • 7+ years of front-end development experience, with a focus on writing performant, optimized code, required; advanced knowledge preferred.
  • Google and Social Ad Platform experience preferred.
  • Experience working with browser inspector and performance tools is preferred.
  • Strong organizational and multitasking skills with the ability to balance competing priorities.
  • Strong attention to detail.
  • Excellent communication (verbal and written) and interpersonal skills and an ability to effectively communicate with both business and technical teams.

Apply HERE

Principal Data Engineer

Responsibilities

  • Interface with our Business Analytics & data science teams, gathering requirements and delivering complete BI solutions.
  • Mentor junior engineers.
  • Model data and metadata to support discovery, ad-hoc and pre-built reporting.
  • Design and implement data pipelines using cloud and on-premise technologies (a brief pipeline sketch follows this list).
  • Own the design, development, and maintenance of datasets our BA teams will use to drive key business decisions.
  • Develop and promote best practices in data engineering, including scalability, reusability, maintainability, and usability.
  • Tune and ensure compute performance by optimizing queries, databases, files, tables, and processes.
  • Analyze and solve problems at their root, stepping back to understand the broader context.
  • Partner with a data platform team to establish SLAs and resolve any data/process issues.
  • Keep up to date with advances in big data technologies and run pilots to design the data architecture to scale with the increased data volume using AWS.
  • Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
  • Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.
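
To give candidates a concrete feel for the pipeline work described above, below is a minimal PySpark sketch that reads raw events from S3, applies a light transformation, and writes a partitioned dataset back out. The bucket paths and column names are hypothetical placeholders chosen only for illustration, not part of any actual VirtueTech or client environment.

    # Minimal PySpark ETL sketch. Illustrative only: bucket paths and column
    # names are hypothetical placeholders, not an actual pipeline.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_events_etl").getOrCreate()

    # Read raw JSON events from a hypothetical S3 landing zone.
    raw = spark.read.json("s3://example-landing-zone/events/")

    # Basic cleanup: drop incomplete rows and derive a partition column.
    clean = (
        raw.dropna(subset=["event_id", "event_ts"])
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Write a partitioned Parquet dataset for downstream BI consumption.
    (clean.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-curated-zone/events/"))

    spark.stop()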

Qualifications

  • Bachelor’s degree in CS or related technical field.
  • 10+ years of experience in data architecture and business intelligence.
  • 3+ years of experience developing solutions with distributed technologies such as Hadoop, Hive, and Spark.
  • Experience delivering end-to-end solutions using AWS services, including S3, RDS, Kinesis, Glue, Redshift, and EMR.
  • Expert in data modeling, metadata management, and data quality.
  • SQL performance tuning.
  • Strong organizational and multitasking skills with ability to balance competing priorities.
  • Experience programming in Python, Java, or Scala.
  • Excellent communication (verbal and written) and interpersonal skills and an ability to effectively communicate with both business and technical teams.
  • An ability to work in a fast-paced environment where continuous innovation is occurring and ambiguity is the norm.
  • Experience with the SQL Server BI stack and a BI reporting tool.
  • Experience migrating from an on-premise to a cloud data platform is a plus.

Apply HERE

DevOps Engineer

Responsibilities:

  • Work with data and analytics experts to strive for greater functionality in our data systems; install Airflow from scratch, then configure, maintain, and administer it (a minimal DAG sketch follows this list).
  • Work independently and keep up to date with enhancements from the Airflow open-source community; coordinate planning with other team members so that nothing is impacted by the installation or configuration of Airflow.
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Work with stakeholders including the Product, Data and Analytic teams to assist with data-related technical issues and support their data infrastructure needs.
  • Implement CI/CD pipelines.
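
As a concrete reference for the Airflow work described above, below is a minimal sketch of an Airflow 2.x DAG with two dependent tasks. The DAG id, schedule, and shell commands are hypothetical placeholders rather than an actual VirtueTech pipeline.

    # Minimal Airflow 2.x DAG sketch. Illustrative only: the DAG id, schedule,
    # and bash commands are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Placeholder extract step.
        extract = BashOperator(task_id="extract", bash_command="echo extracting")

        # Placeholder load step.
        load = BashOperator(task_id="load", bash_command="echo loading")

        # The extract task must complete before the load task runs.
        extract >> load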

Requirements:

  • Experience building data pipelines using EC2, EMR, RDS, Redshift, Spark, and Python.
  • Understanding of data, with the ability to discuss requirements with analysts and translate them into material that data engineers can easily understand.
  • Understanding of, and the ability to develop, data models for traditional relational and distributed technologies.
  • Experience with SQL and AWS.
  • Experience with engineering processes (SDLC, the data engineering cycle, DataOps, DevOps).
  • Effective in driving meetings and discussions.
  • Good project planning and execution skills.

Apply HERE

Technical Program Manager

Responsibilities:

  • Drive efficiency, effectiveness, and results.
  • Create epics, stories, and requirements that developers can act on; maintain the backlog.
  • Plan and execute large projects, fitting them into an incremental delivery model.
  • Run scrum ceremonies (grooming, planning, standups and retros)
  • Run requirements, design and code reviews
  • Partner with analytics and other technology teams.
  • Be responsible for deliverables
  • Contribute to prioritization, architectural decisions, approach selection, and data modeling activities.
  • Evangelize the data stack and solutions delivered across analytic partners
  • Manage relationships with stakeholders, understand their requirements, prepare roadmaps, and set expectations.
  • Provide operational metrics
  • Provide metrics on team performance (velocity etc.)

Requirements:

  • Experience with data solutions using Hive, Spark, S3, EMR, Airflow, and Python.
  • Understanding of data, with the ability to discuss requirements with analysts and translate them into material that data engineers can easily understand.
  • Understanding of, and the ability to develop, data models for traditional relational and distributed technologies.
  • Experience with REST APIs and strong SQL skills.
  • Experience with engineering processes (SDLC, the data engineering cycle, DataOps, DevOps).
  • Effective in driving meetings and discussions.
  • Good project planning and execution skills.

Apply HERE


