enhanced Internet functionality

Post your suggestions for future versions of NeoBook

Moderator: Neosoft Support

enhanced Internet functionality

Postby dpayer » Fri Jun 08, 2007 8:32 am

Currently the Internet section of commands includes:

CheckInternetConnection
ConnectToInternet
DisconnectFromInternet
InternetLink
SendMail
DownloadFile
InternetFileExists
InternetFileSize
InternetPost
InternetGet

I would like to suggest that, in a future version, the open-source products Wget and cURL be integrated into NB. The code is freely available, and the licenses allow inclusion as long as access to the original code is made available to the recipient. This could, of course, be done by a plugin or as part of an enhanced new baseline NeoSoft product. It would provide the following functions:


FEATURES of cURL http://curl.haxx.se/docs/features.html

curl tool
- config file support
- multiple URLs in a single command line
- range "globbing" support: [0-13], {one,two,three}
- multiple file upload on a single command line
- custom maximum transfer rate
- redirectable stderr

libcurl supports
- full URL syntax with no length limit
- custom maximum download time
- custom least download speed acceptable
- custom output result after completion
- guesses protocol from host name unless specified
- uses .netrc
- progress bar/time specs while downloading
- "standard" proxy environment variables support
- compiles on win32 (reported builds on 40+ operating systems)
- selectable network interface for outgoing traffic
- IPv6 support on unix and Windows
- persistent connections
- socks5 support
- supports user name + password in proxy environment variables
- operations through proxy "tunnel" (using CONNECT)
- supports large files (>2GB and >4GB) both upload/download
- replaceable memory functions (malloc, free, realloc, etc)
- asynchronous name resolving (*6)
- both a push and a pull style interface
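
To make the suggestion concrete, here is a minimal sketch in C of the kind of call a plugin wrapping libcurl would make internally for a DownloadFile-style action (the URL and output file name are placeholders, not anything NeoBook-specific):

#include <stdio.h>
#include <curl/curl.h>

/* Write callback: libcurl hands us downloaded data and we append it to the file. */
static size_t write_cb(void *ptr, size_t size, size_t nmemb, void *stream)
{
    return fwrite(ptr, size, nmemb, (FILE *)stream);
}

int main(void)
{
    FILE *out = fopen("index.html", "wb");
    CURL *curl;

    curl_global_init(CURL_GLOBAL_DEFAULT);
    curl = curl_easy_init();
    if (curl && out) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L); /* follow redirects */
        if (curl_easy_perform(curl) != CURLE_OK)
            fprintf(stderr, "download failed\n");
        curl_easy_cleanup(curl);
    }
    if (out) fclose(out);
    curl_global_cleanup();
    return 0;
}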

HTTP
- HTTP/1.1 compliant (optionally uses 1.0)
- GET
- PUT
- HEAD
- POST
- Pipelining
- multipart formpost (RFC1867-style)
- authentication: Basic, Digest, NTLM(*1), GSS-Negotiate/Negotiate(*3) and SPNEGO (*4) to server and proxy
- resume (both GET and PUT)
- follow redirects
- maximum amount of redirects to follow
- custom HTTP request
- cookie get/send fully parsed
- reads/writes the netscape cookie file format
- custom headers (replace/remove internally generated headers)
- custom user-agent string
- custom referer string
- range
- proxy authentication
- time conditions
- via http-proxy
- retrieve file modification date
- Content-Encoding support for deflate and gzip
- "Transfer-Encoding: chunked" support for "uploads"

HTTPS (*1)
- (all the HTTP features)
- using client certificates
- verify server certificate
- via http-proxy
- select desired encryption
- force usage of a specific SSL version (SSLv2(*7), SSLv3 or TLSv1)

FTP
- download
- authentication
- kerberos4 (*5)
- active/passive using PORT, EPRT, PASV or EPSV
- single file size information (compare to HTTP HEAD)
- 'type=' URL support
- dir listing
- dir listing names-only
- upload
- upload append
- upload via http-proxy as HTTP PUT
- download resume
- upload resume
- custom ftp commands (before and/or after the transfer)
- simple "range" support
- via http-proxy
- all operations can be tunneled through a http-proxy
- customizable to retrieve file modification date
- no dir depth limit
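
Similarly, an FTP upload with resume is only a few calls on the same kind of handle (sketch; the host, credentials and file name are placeholders):

#include <stdio.h>
#include <curl/curl.h>

/* Read callback: libcurl pulls data from the local file to send. */
static size_t read_cb(void *ptr, size_t size, size_t nmemb, void *stream)
{
    return fread(ptr, size, nmemb, (FILE *)stream);
}

int main(void)
{
    CURL *curl = curl_easy_init();
    FILE *in = fopen("report.zip", "rb");
    if (curl && in) {
        curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/upload/report.zip");
        curl_easy_setopt(curl, CURLOPT_USERPWD, "user:password");
        curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);
        curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
        curl_easy_setopt(curl, CURLOPT_READDATA, in);
        /* -1 asks libcurl to work out the FTP upload resume point itself. */
        curl_easy_setopt(curl, CURLOPT_RESUME_FROM, -1L);
        if (curl_easy_perform(curl) != CURLE_OK)
            fprintf(stderr, "upload failed\n");
        curl_easy_cleanup(curl);
    }
    if (in) fclose(in);
    return 0;
}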

FTPS (*1)
- implicit ftps:// support that uses SSL on both connections
- explicit "AUTH TLS" and "AUTH SSL" usage to "upgrade" a plain ftp:// connection to use SSL for both or one of the connections

SCP (*8)
- both password and public key auth

SFTP (*8)
- both password and public key auth
- with custom commands sent before/after the transfer

TFTP
- download / upload

TELNET
- connection negotiation
- custom telnet options
- stdin/stdout I/O

LDAP (*2)
- full LDAP URL support

DICT
- extended DICT URL support

FILE
- URL support
- "uploads"
- resume

FOOTNOTES
=========

*1 = requires OpenSSL, GnuTLS, NSS or yassl
*2 = requires OpenLDAP
*3 = requires a GSSAPI-compliant library, such as Heimdal or similar.
*4 = requires FBopenssl
*5 = requires a krb4 library, such as the MIT one or similar.
*6 = requires c-ares
*7 = requires OpenSSL or NSS, as GnuTLS only supports SSLv3 and TLSv1
*8 = requires libssh2


WGET FEATURES - http://www.gnu.org/software/wget/manual/html_node/Overview.html#Overview

- Wget can follow links in html and xhtml pages and create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as “recursive downloading.”
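
For example, mirroring a site for offline viewing combines a handful of the options listed in the help text below (example.com is a placeholder):

wget --mirror --convert-links --page-requisites --no-parent http://example.com/

--mirror turns on recursion with timestamping, --convert-links rewrites links to point at the local copies, and --page-requisites pulls in the images and stylesheets each page needs.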

here is the help text:

GNU Wget 1.10, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
-V, --version display the version of Wget and exit.
-h, --help print this help.
-b, --background go to background after startup.
-e, --execute=COMMAND execute a `.wgetrc'-style command.

Logging and input file:
-o, --output-file=FILE log messages to FILE.
-a, --append-output=FILE append messages to FILE.
-d, --debug print lots of debugging information.
-q, --quiet quiet (no output).
-v, --verbose be verbose (this is the default).
-nv, --no-verbose turn off verboseness, without being quiet.
-i, --input-file=FILE download URLs found in FILE.
-F, --force-html treat input file as HTML.
-B, --base=URL prepends URL to relative links in -F -i file.

Download:
-t, --tries=NUMBER set number of retries to NUMBER (0 unlimits).
--retry-connrefused retry even if connection is refused.
-O, --output-document=FILE write documents to FILE.
-nc, --no-clobber skip downloads that would download to existing files.
-c, --continue resume getting a partially-downloaded file.
--progress=TYPE select progress gauge type.
-N, --timestamping don't re-retrieve files unless newer than local.
-S, --server-response print server response.
--spider don't download anything.
-T, --timeout=SECONDS set all timeout values to SECONDS.
--dns-timeout=SECS set the DNS lookup timeout to SECS.
--connect-timeout=SECS set the connect timeout to SECS.
--read-timeout=SECS set the read timeout to SECS.
-w, --wait=SECONDS wait SECONDS between retrievals.
--waitretry=SECONDS wait 1..SECONDS between retries of a retrieval.
--random-wait wait from 0...2*WAIT secs between retrievals.
-Y, --proxy explicitly turn on proxy.
--no-proxy explicitly turn off proxy.
-Q, --quota=NUMBER set retrieval quota to NUMBER.
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host.
--limit-rate=RATE limit download rate to RATE.
--no-dns-cache disable caching DNS lookups.
--restrict-file-names=OS restrict chars in file names to ones OS allows.
--user=USER set both ftp and http user to USER.
--password=PASS set both ftp and http password to PASS.

Directories:
-nd, --no-directories don't create directories.
-x, --force-directories force creation of directories.
-nH, --no-host-directories don't create host directories.
--protocol-directories use protocol name in directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components.

HTTP options:
--http-user=USER set http user to USER.
--http-password=PASS set http password to PASS.
--no-cache disallow server-cached data.
-E, --html-extension save HTML documents with `.html' extension.
--ignore-length ignore `Content-Length' header field.
--header=STRING insert STRING among the headers.
--proxy-user=USER set USER as proxy username.
--proxy-password=PASS set PASS as proxy password.
--referer=URL include `Referer: URL' header in HTTP request.
--save-headers save the HTTP headers to file.
-U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION.
--no-http-keep-alive disable HTTP keep-alive (persistent connections).
--no-cookies don't use cookies.
--load-cookies=FILE load cookies from FILE before session.
--save-cookies=FILE save cookies to FILE after session.
--keep-session-cookies load and save session (non-permanent) cookies.
--post-data=STRING use the POST method; send STRING as the data.
--post-file=FILE use the POST method; send contents of FILE.

HTTPS (SSL/TLS) options:
--secure-protocol=PR choose secure protocol, one of auto, SSLv2, SSLv3, and TLSv1.
--no-check-certificate don't validate the server's certificate.
--certificate=FILE client certificate file.
--certificate-type=TYPE client certificate type, PEM or DER.
--private-key=FILE private key file.
--private-key-type=TYPE private key type, PEM or DER.
--ca-certificate=FILE file with the bundle of CA's.
--ca-directory=DIR directory where hash list of CA's is stored.
--random-file=FILE file with random data for seeding the SSL PRNG.
--egd-file=FILE file naming the EGD socket with random data.

FTP options:
--ftp-user=USER set ftp user to USER.
--ftp-password=PASS set ftp password to PASS.
--no-remove-listing don't remove `.listing' files.
--no-glob turn off FTP file name globbing.
--no-passive-ftp disable the "passive" transfer mode.
--retr-symlinks when recursing, get linked-to files (not dir).
--preserve-permissions preserve remote file permissions.

Recursive download:
-r, --recursive specify recursive download.
-l, --level=NUMBER maximum recursion depth (inf or 0 for infinite).
--delete-after delete files locally after downloading them.
-k, --convert-links make links in downloaded HTML point to local files.
-K, --backup-converted before converting file X, back up as X.orig.
-m, --mirror shortcut option equivalent to -r -N -l inf -nr.
-p, --page-requisites get all images, etc. needed to display HTML page.
--strict-comments turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
-A, --accept=LIST comma-separated list of accepted extensions.
-R, --reject=LIST comma-separated list of rejected extensions.
-D, --domains=LIST comma-separated list of accepted domains.
--exclude-domains=LIST comma-separated list of rejected domains.
--follow-ftp follow FTP links from HTML documents.
--follow-tags=LIST comma-separated list of followed HTML tags.
--ignore-tags=LIST comma-separated list of ignored HTML tags.
-H, --span-hosts go to foreign hosts when recursive.
-L, --relative follow relative links only.
-I, --include-directories=LIST list of allowed directories.
-X, --exclude-directories=LIST list of excluded directories.
-np, --no-parent don't ascend to the parent directory.
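
Even without linking the library, a NeoBook action could get most of this by shelling out to the wget executable and relaying its output. A minimal sketch in C (assumes wget is on the PATH; the URL is a placeholder, and on Windows the calls are _popen/_pclose):

#include <stdio.h>

int main(void)
{
    char line[512];
    /* wget logs progress to stderr, so merge it into the pipe with 2>&1. */
    FILE *p = popen("wget --tries=3 --timeout=30 -O page.html http://example.com/ 2>&1", "r");
    if (!p) {
        perror("popen");
        return 1;
    }
    /* Relay wget's log/progress output line by line. */
    while (fgets(line, sizeof line, p))
        fputs(line, stdout);
    return pclose(p);
}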

Postby Neosoft Support » Fri Jun 08, 2007 10:58 am

Thanks for your very detailed suggestion. I have printed this post and will study it further when time permits.
NeoSoft Support