
YipitData

Web Scraping Engineer

Posted 10 Days Ago
Remote
Hiring Remotely in India
Mid level

About YipitData:

YipitData is the leading market research and analytics firm for the disruptive economy and recently raised up to $475M from The Carlyle Group at a valuation over $1B.

We analyze billions of alternative data points every day to provide accurate, detailed insights on ridesharing, e-commerce marketplaces, payments, and more. Our on-demand insights team uses proprietary technology to identify, license, clean, and analyze the data many of the world’s largest investment funds and corporations depend on.

For three years and counting, we have been recognized as one of Inc’s Best Workplaces. We are a fast-growing technology company backed by The Carlyle Group and Norwest Venture Partners. Our offices are located in NYC, Austin, Miami, Denver, Mountain View, Seattle, Hong Kong, Shanghai, Beijing, Guangzhou, and Singapore. We cultivate a people-centric culture focused on mastery, ownership, and transparency.

Why You Should Apply NOW:

  • High Impact: Your work will directly influence key reports and strategic decisions across multiple business units.
  • Exciting Challenges: Tackle the design of resilient web scrapers, navigate dynamic website structures, and optimize large-scale data extraction.
  • Growth Opportunities: As an early member of our expanding Data Solutions team, you will have significant input on our strategies, processes, and team culture.

About The Role:

We are seeking a Web Scraping Engineer [Official, Internal Title: Data Solutions Engineer] to join our growing Data Solutions team. Reporting directly to the Data Solutions Engineering Manager, you will play a pivotal role in designing, refactoring, and maintaining the web scrapers that power critical reports across our organization. Your contributions will ensure our data ingestion processes are resilient, efficient, and scalable, directly supporting multiple business units and products.

As Our Data Solutions Engineer You Will:

Refactor and Maintain Web Scrapers

  • Overhaul existing scraping scripts to improve reliability, maintainability, and efficiency.
  • Implement best coding practices (clean code, modular architecture, code reviews, etc.) to ensure quality and sustainability; a minimal sketch of this modular structure follows this list.
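
By way of illustration, here is a minimal sketch of the kind of modular scraper structure referenced above, with fetching, parsing, and orchestration kept in separate, independently testable units. The URL, CSS selectors, and Listing fields are placeholders chosen for this example, not actual YipitData targets or schemas.

```python
# Minimal sketch of a modular scraper: fetch, parse, and orchestration are
# separate, independently testable units. The CSS selectors below are
# placeholders for illustration, not real targets.
from dataclasses import dataclass

import requests
from bs4 import BeautifulSoup


@dataclass
class Listing:
    title: str
    price: str


def fetch(url: str, timeout: float = 10.0) -> str:
    """Retrieve raw HTML; all network concerns live in this layer."""
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()
    return response.text


def parse(html: str) -> list[Listing]:
    """Turn HTML into typed records; no network access in this layer."""
    soup = BeautifulSoup(html, "html.parser")
    records = []
    for item in soup.select(".listing"):  # placeholder selector
        title = item.select_one(".title")
        price = item.select_one(".price")
        if title and price:
            records.append(Listing(title.get_text(strip=True),
                                   price.get_text(strip=True)))
    return records


def run(url: str) -> list[Listing]:
    """Thin orchestration layer gluing the pieces together."""
    return parse(fetch(url))
```

Keeping network access out of the parsing layer is one common way to make unit testing and refactoring less painful, which is the spirit of the bullets above.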

Implement Advanced Scraping Techniques

  • Utilize sophisticated fingerprinting methods (cookies, headers, user-agent rotation, proxies) to avoid detection and blocking; a rotation sketch follows this list.
  • Handle dynamic content, navigate complex DOM structures, and manage session/cookie lifecycles effectively.
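
As a rough illustration of these techniques, the sketch below rotates user-agents and headers, keeps cookies in a session, and routes traffic through a proxy pool using requests. The user-agent strings and proxy endpoints are placeholders; a production setup would typically draw these from managed pools.

```python
# Illustrative rotation of user-agents, headers, and proxies with requests.
# The user-agent strings and proxy endpoints are placeholders.
import random

import requests

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
]

PROXIES = [
    "http://proxy-1.example.com:8080",  # placeholder residential proxy
    "http://proxy-2.example.com:8080",  # placeholder data center proxy
]


def fetch_with_rotation(url: str) -> requests.Response:
    """Issue one request with a randomly chosen identity and exit node."""
    proxy = random.choice(PROXIES)
    session = requests.Session()  # the session keeps cookies across calls
    session.headers.update({
        "User-Agent": random.choice(USER_AGENTS),
        "Accept-Language": "en-US,en;q=0.9",
        "Referer": "https://example.com/",  # placeholder referer
    })
    return session.get(url, proxies={"http": proxy, "https": proxy}, timeout=10)
```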

Collaborate with Cross-Functional Teams

  • Work closely with analysts and other stakeholders to gather requirements, align on targets, and ensure data quality.
  • Support internal users of our web scraping tooling by providing troubleshooting, documentation, and best practices to ensure efficient data usage for critical reporting.

Monitor and Troubleshoot

  • Develop robust monitoring and alerting frameworks to quickly identify and address failures (see the sketch after this list).
  • Continuously evaluate scraper performance, proactively diagnosing bottlenecks and scaling issues.
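
A lightweight sketch of what such monitoring can look like: each scrape is wrapped, failures are logged with tracebacks, and an alert fires when the failure rate crosses a threshold. The send_alert function here is a stub standing in for whatever alerting integration is actually used (Slack, PagerDuty, etc.).

```python
# Lightweight monitoring sketch: run a batch of scrape callables, log failures
# with tracebacks, and alert when the failure rate crosses a threshold.
# send_alert is a stub standing in for a real alerting integration.
import logging
from typing import Callable, Iterable

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("scraper.monitor")


def send_alert(message: str) -> None:
    """Placeholder alert sink; in practice this might page an on-call channel."""
    logger.error("ALERT: %s", message)


def run_with_monitoring(scrapes: Iterable[Callable[[], None]],
                        failure_threshold: float = 0.2) -> None:
    scrapes = list(scrapes)
    failures = 0
    for scrape in scrapes:
        try:
            scrape()
        except Exception:
            failures += 1
            logger.exception("Scrape %s failed", getattr(scrape, "__name__", scrape))
    rate = failures / max(len(scrapes), 1)
    logger.info("Completed %d scrapes, failure rate %.0f%%", len(scrapes), rate * 100)
    if rate > failure_threshold:
        send_alert(f"Scraper failure rate {rate:.0%} exceeded {failure_threshold:.0%}")
```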

Drive Continuous Improvement

  • Propose new tooling, methodologies, and technologies to enhance our scraping capabilities and processes.
  • Stay up to date with industry trends, evolving bot-detection tactics, and novel approaches to web data extraction.

This is a fully remote opportunity based in India. Standard work hours are 11 AM to 8 PM IST, with some flexibility.

You Are Likely To Succeed If You Have:

  • Effective communication skills in English with both technical and non-technical stakeholders.
  • 4+ years of experience with web scraping frameworks (e.g., Selenium, Playwright, or Puppeteer); a browser-automation sketch follows this list.
  • A strong understanding of HTTP, RESTful APIs, HTML parsing, browser rendering, and TLS/SSL mechanics.
  • Expertise in advanced fingerprinting and evasion strategies (e.g., browser fingerprint spoofing, request signature manipulation).
  • Deep experience managing cookies, headers, session state, and proxy rotation, including the deployment of both residential and data center proxies.
  • Experience with logging, metrics, and alerting to ensure high availability.
  • Troubleshooting skills to optimize scraper performance for efficiency, reliability, and scalability.
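
As a brief, hedged sketch of the kind of browser automation referenced above, the example below uses Playwright's sync API to render a JavaScript-heavy page before extracting text. The URL and selector are placeholders, and the example assumes the playwright package and its browser binaries are installed (via `playwright install`).

```python
# Hedged sketch of rendering a JavaScript-heavy page with Playwright's sync API
# before extracting text. The URL and selector are placeholders; requires the
# playwright package and its browser binaries (playwright install).
from playwright.sync_api import sync_playwright


def scrape_dynamic(url: str, selector: str) -> list[str]:
    """Render the page in headless Chromium and return the matched texts."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")  # wait for XHR-driven content
        page.wait_for_selector(selector)
        texts = page.locator(selector).all_inner_texts()
        browser.close()
        return texts


if __name__ == "__main__":
    print(scrape_dynamic("https://example.com/app", ".result-card"))  # placeholders
```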

What We Offer:

Our compensation package includes comprehensive benefits, perks, and a competitive salary: 

  • We care about your personal life and we mean it. We offer vacation time, parental leave, team events, learning reimbursement, and more!
  • Your growth at YipitData is determined by the impact that you are making, not by tenure, unnecessary facetime, or office politics. Everyone at YipitData is empowered to learn, self-improve, and master their skills in an environment focused on ownership, respect, and trust.

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, marital status, disability, gender, gender identity or expression, or veteran status. We are proud to be an equal-opportunity employer.


Top Skills

HTML Parsing
HTTP
Playwright
Puppeteer
RESTful APIs
Selenium
TLS/SSL
