Backend Engineer – Web Crawling Systems
Responsibilities:
- Design and develop distributed web crawling system architectures.
- Optimize core crawling algorithms and strategies, including task scheduling and data-collection efficiency.
- Monitor, optimize, and maintain system performance to ensure high availability, reliability, and stability.
Qualifications:
- Bachelor’s degree or equivalent practical experience.
- Three or more years of backend development experience; exceptional candidates with less experience may also be considered.
- Proficient in Linux environments; strong experience with one or more backend languages such as Go, Python, or Node.js; solid understanding of asynchronous programming and multi-threading/multi-processing techniques.
- Strong understanding of web crawling principles and hands-on experience with common crawling frameworks and components.
- Experience with databases and messaging systems such as MySQL, Redis, Elasticsearch, and Kafka; experience in big data systems is a plus.
- Experience designing general-purpose web crawlers is a strong plus.
- Experience with anti-scraping mechanisms and techniques for working around them is a strong plus.
If you are interested in this position, please submit your resume and cover letter to shandahr@shanda.com. Referrals from recruitment agencies are also welcome.