Extract data of interest from websites over HTTP/S connections using XPath/XQuery expressions or CSS selectors, follow redirects and links, and save the results as XML or HTML.
- Version: 0.9.6
- License: Trial
- OS: Windows All
- Publisher: Benito van der Zander
Most kinds of information are easy to reach over the web, but an Internet connection is not always available. Saving content locally for offline use can therefore be incredibly helpful for research, especially when that content is mostly text. Extracting such data from web pages is exactly the purpose of Xidel, a command line tool.
The application comes in a small package, and there is no setup involved. It does not provide a graphical interface of its own; instead, it is used from a command line console, which does not need to be started with administrator privileges.
Data can be extracted with the help of several pattern-matching mechanisms, namely XQuery/XPath expressions and CSS 3 selectors. Extraction works over both HTTP and HTTPS connections, and Xidel can follow redirections, links, and previously extracted values.
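A few illustrative invocations sketch how this looks in practice. The flag names (`-e`/`--extract`, `--follow`) and the `css()` wrapper match recent 0.9.x builds, but the exact syntax should be confirmed against `xidel --help`; the URL is a placeholder:

```shell
# Extract the page title with an XPath expression
xidel http://example.org -e "//title"

# The same element selected with a CSS 3 selector, wrapped in css()
xidel http://example.org -e 'css("title")'

# Follow every link on the page and extract each target's title
xidel http://example.org --follow "//a" -e "//title"
```

The `--follow` argument is itself a query, so the set of pages to visit can be narrowed with the same XPath or CSS machinery used for extraction.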
Various switches control the process, and the help command lists all available options with explanations. Extraction output can be filtered to include or exclude particular named values, and links as well as redirections can be followed across a website.
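For instance, several values can be extracted under variable names and then filtered from the output. The `--extract-exclude` switch below is taken from the 0.9.x help text, and the variable-assignment syntax (`name:=query`) is how Xidel names extracted values; both should be checked against `xidel --help` on your build:

```shell
# Show all switches with their explanations
xidel --help

# Extract two named values, but drop one of them from the output
xidel http://example.org \
  -e 'title:=//title, author:=//meta[@name="author"]/@content' \
  --extract-exclude author
```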
Connections can be routed through an HTTP proxy, additional header fields can be attached to requests, and a delay can be inserted between requests. XML and HTML are among the supported export formats.
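These connection and output options can be combined in a single call. The switch names here (`--proxy`, `--header`, `--wait`, `--output-format`) are assumed from the 0.9.x help output, and the proxy address and header value are invented for illustration:

```shell
# Route the request through an HTTP proxy, add a custom header,
# pause two seconds between requests, and emit the result as XML
xidel http://example.org -e "//title" \
  --proxy proxy.local:8080 \
  --header "User-Agent: research-bot" \
  --wait 2 \
  --output-format xml
```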