Senior Web Crawler & Data Extraction Engineer



Netgraph Networking, Jaipur



Job Title: Senior Web Crawler & Data Extraction Engineer


Job Type: Full-Time


Summary

To enhance user profiling and risk assessment, we are building web crawlers to collect relevant user data from third-party sources, forums, and the dark web. We are seeking a Senior Web Crawler & Data Extraction Engineer to design and implement these data collection solutions.


Job Responsibilities

  • Design, develop, and maintain web crawlers and scrapers to extract data from open web
    sources, forums, marketplaces, and the dark web.
  • Implement data extraction pipelines that aggregate, clean, and structure data for fraud
    detection and risk profiling.
  • Use Tor, VPNs, and other anonymization techniques to safely crawl the dark web while
    avoiding detection.
  • Develop real-time monitoring solutions for tracking fraudulent activities, data breaches, and
    cybercrime discussions.
  • Optimize crawling speed and ensure compliance with website terms of service, ethical
    standards, and legal frameworks.
  • Integrate extracted data with fraud detection models, risk scoring algorithms, and
    cybersecurity intelligence tools
    .
  • Work with data scientists and security analysts to develop threat intelligence dashboards from collected data.
  • Implement anti-bot detection evasion techniques and handle CAPTCHAs using AI-driven
    solvers where necessary.
  • Stay updated on OSINT (Open-Source Intelligence) techniques, web scraping best practices,
    and cybersecurity trends.

Requirements

  • 5+ years of experience in web crawling, data scraping, or cybersecurity data extraction.
  • Strong proficiency in Python, Scrapy, Selenium, BeautifulSoup, Puppeteer, or similar
    frameworks.
  • Experience working with Tor, proxies, and VPNs for anonymous web scraping.
  • Deep understanding of HTTP protocols, web security, and bot detection mechanisms.
  • Experience parsing structured and unstructured data from JSON, XML, and web pages.
  • Strong knowledge of database management (SQL, NoSQL) for storing large-scale crawled
    data.
  • Familiarity with AI/ML-based fraud detection techniques and data classification methods.
  • Experience working with cybersecurity intelligence sources, dark web monitoring, and
    OSINT tools.
  • Ability to implement scalable, distributed web crawling architectures.
  • Knowledge of data privacy regulations (GDPR, CCPA) and ethical data collection practices.
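The structured-data parsing requirement above can be sketched with the standard library alone: normalizing JSON and XML payloads into one record shape. The field names (`name`, `email`) and sample payloads are hypothetical, chosen only to show the pattern:

```python
import json
import xml.etree.ElementTree as ET

def records_from_json(payload: str) -> list[dict]:
    """Normalize a JSON array of entries into uniform records."""
    return [
        {"name": item.get("name", ""), "email": item.get("email", "")}
        for item in json.loads(payload)
    ]

def records_from_xml(payload: str) -> list[dict]:
    """Normalize <user> elements into the same record shape."""
    root = ET.fromstring(payload)
    return [
        {"name": u.findtext("name", ""), "email": u.findtext("email", "")}
        for u in root.iter("user")
    ]

json_src = '[{"name": "alice", "email": "a@example.com"}]'
xml_src = ("<users><user><name>alice</name>"
           "<email>a@example.com</email></user></users>")

# Both sources reduce to identical records downstream.
assert records_from_json(json_src) == records_from_xml(xml_src)
```

Keeping every source behind one record schema is what lets later stages (storage, scoring, dashboards) stay source-agnostic.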

Nice to Have

  • Experience in fintech, fraud detection, or threat intelligence.
  • Knowledge of natural language processing (NLP) for analyzing cybercrime discussions.
  • Familiarity with machine learning-driven anomaly detection for fraud prevention.
  • Hands-on experience with cloud-based big data solutions (AWS, GCP, Azure, Elasticsearch, Kafka).
Full Time, Permanent
IT & Information Security - Other
Education
Any Graduate

Job Location

Jaipur


Information

Position: Data Analyst
Job Type: Experienced
Type of Workplace: On-site
Languages: English (Basic)
Salary Range: Based on interview
