package scraper

v0.2.1

Warning: This package is not in the latest version of its module.

Published: Oct 12, 2025 License: MIT Imports: 14 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var Cookies = map[string]string{
	"PHPSESSID":           "ae6doi5fo94hv5k2ouqmopd47k",
	"s2_csrf_cookie_name": "cf0b4574d2c27713afd4b26879597e5d",
	"s2_theme_ui":         "red",
	"s2_uGoo":             "w6a162dd67b1968e6349944bcff010fdd63ee724",
	"s2_uLang":            "en",
	"sh":                  "72",
	"sw":                  "95.4",
}

Cookies contains the cookies required for requests

var Headers = map[string]string{
	"Content-Type":     "application/x-www-form-urlencoded; charset=UTF-8",
	"X-Requested-With": "XMLHttpRequest",
	"Origin":           "https://myip.ms",
	"Referer":          "https://myip.ms/browse/sites/1",
	"User-Agent":       "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36",
	"Accept":           "*/*",
}

Headers contains the default headers for HTTP requests

Functions

This section is empty.

Types

type HTTPClient

type HTTPClient struct {
	// contains filtered or unexported fields
}

HTTPClient represents an HTTP client with headers and cookies

func NewHTTPClient

func NewHTTPClient(proxyURL, proxyUser, proxyPass string) *HTTPClient

NewHTTPClient creates a new HTTP client with default cookies and optional proxy

func (*HTTPClient) Get

func (hc *HTTPClient) Get(url string) (*http.Response, error)

Get performs a GET request with the configured headers and cookies

func (*HTTPClient) Post

func (hc *HTTPClient) Post(url string, data url.Values) (*http.Response, error)

Post performs a POST request with the configured headers and cookies

type Runner

type Runner struct {
	// contains filtered or unexported fields
}

Runner orchestrates the scraping process

func NewRunner

func NewRunner(opts *options.Options) (*Runner, error)

NewRunner creates a new Runner with the given options

func (*Runner) Close

func (r *Runner) Close() error

Close closes the output file

func (*Runner) Run

func (r *Runner) Run() error

Run executes the scraping process
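The intended call sequence for the Runner (create, run, close the output file) can be sketched as follows; stubRunner is a stand-in defined here so the example compiles on its own, not the package's real type:

```go
package main

import "fmt"

// stubRunner stands in for the package's Runner purely to show the
// call sequence; the real Runner wraps options and an output file.
type stubRunner struct{ closed bool }

func newStubRunner() (*stubRunner, error) { return &stubRunner{}, nil }
func (r *stubRunner) Run() error          { fmt.Println("scraping..."); return nil }
func (r *stubRunner) Close() error        { r.closed = true; return nil }

func main() {
	// The same shape applies to the real API:
	//   r, err := scraper.NewRunner(opts)
	//   defer r.Close()
	//   err = r.Run()
	r, err := newStubRunner()
	if err != nil {
		panic(err)
	}
	defer r.Close()
	if err := r.Run(); err != nil {
		panic(err)
	}
}
```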

type Scraper

type Scraper struct {
	// contains filtered or unexported fields
}

Scraper holds the HTTP client and URL template for efficient scraping

func NewScraper

func NewScraper(httpClient *HTTPClient, filter *options.Filter) *Scraper

NewScraper creates a new scraper with the HTTP client and builds the URL template

func (*Scraper) FetchPage

func (s *Scraper) FetchPage(page int, includeFields map[string]bool) ([]*types.DomainData, error)

FetchPage fetches a single page and returns the domain data found
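A per-page fetcher like this is typically driven by a loop that advances the page number until a page comes back empty. Sketching that pattern with a stub of the same shape (fetchPage and domainData are hypothetical stand-ins for the method and types.DomainData):

```go
package main

import "fmt"

// domainData is a stand-in for types.DomainData.
type domainData struct{ Name string }

// fetchPage mimics the shape of Scraper.FetchPage: it serves three
// pages of canned rows, then an empty page to end the loop.
func fetchPage(page int, includeFields map[string]bool) ([]*domainData, error) {
	if page > 3 {
		return nil, nil
	}
	return []*domainData{{Name: fmt.Sprintf("site-%d.example", page)}}, nil
}

func main() {
	fields := map[string]bool{"domain": true}
	var all []*domainData
	// Fetch successive pages until one returns no rows.
	for page := 1; ; page++ {
		rows, err := fetchPage(page, fields)
		if err != nil {
			panic(err)
		}
		if len(rows) == 0 {
			break
		}
		all = append(all, rows...)
	}
	fmt.Println(len(all)) // 3
}
```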
