--
Net::Ping::External is a module which interfaces with the ping(1)
command on the system. It currently provides a single function,
ping(), which takes a hostname and (optionally) a timeout, and
returns true if the host is alive and false otherwise. Unless you
have the ability (and willingness) to run your scripts as the
superuser on your system, this module will probably provide more
accurate results than Net::Ping (bundled with the Perl base
installation).
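For illustration only, here is a minimal Python sketch of the same
technique (shelling out to ping(1) rather than opening raw ICMP
sockets, so no superuser privileges are needed); the module itself
is Perl, and the flags below assume a Linux-style ping(1):

    import subprocess

    def ping(host, timeout=5):
        # Return True if host answers one ICMP echo, else False.
        # Lets the setuid ping(1) binary do the raw-socket work.
        # -w (deadline in seconds) is Linux-specific; BSD ping
        # uses different flags for its timeout.
        result = subprocess.run(
            ["ping", "-c", "1", "-w", str(timeout), host],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        return result.returncode == 0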
Submitted and maintained by Maurice Nonnekes <maurice@amaze.nl>
--
libpcap is a packet capture library. It is used by all sorts of
network diagnostic programs (such as tcpdump and nmap).
py-libpcap is a Python interface to this library.
WWW: http://sourceforge.net/projects/pylibpcap/
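As a rough sketch of what the binding looks like in use, based on
the sniff.py example shipped with pylibpcap (check the examples
installed with the port for the exact API; the device name and
filter below are placeholders):

    import pcap  # provided by py-libpcap

    def handler(pktlen, data, timestamp):
        # Called once for every captured packet.
        print(timestamp, pktlen, "bytes")

    p = pcap.pcapObject()
    p.open_live("eth0", 1600, 0, 100)  # device, snaplen, promisc, ms
    p.setfilter("tcp port 80", 0, 0)   # BPF filter string
    p.loop(10, handler)                # capture ten packets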
* Take care to set a sane PATH.
* Add set -e
* Copy example files into ${PREFIX}/share/examples/PORTNAME.
* Replace PKGNAME with P_NAME in the INSTALL/DEINSTALL scripts; the
  old name was confusing, since PKGNAME already exists in the
  Makefile with a different value.
* Change the output of INSTALL/DEINSTALL to be more like other
  scripts found in the tree (suggested by heko@).
* Add missing gdbm dependency.
Submitted by maintainer Nils Nordman <nino@nforced.com>.
Added code in both the client and the server to detect whether the
peer is an old version with the S1G bug. The server will refuse
to serve such clients, and the client will refuse updates from
such a server. In each case, an error message is printed with a
URL that describes the bug and the upgrade procedure.
Resolv resolves the names of a single IP address or of an entire
network of addresses, letting a user maintain a "map" of the names
that make up a given network.
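The technique itself is easy to illustrate. A minimal Python sketch
(not Resolv itself, just the reverse-lookup sweep it automates):

    import ipaddress
    import socket

    def name_map(network):
        # Reverse-resolve every host in e.g. "192.0.2.0/28".
        names = {}
        for addr in ipaddress.ip_network(network).hosts():
            try:
                names[str(addr)] = socket.gethostbyaddr(str(addr))[0]
            except OSError:
                names[str(addr)] = None  # no PTR record
        return names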
The crawl utility starts a depth-first traversal of the web at the
specified URLs. It stores all JPEG images that match the configured
constraints. Crawl is fairly fast and allows for graceful termination.
After terminating crawl, it is possible to restart it at exactly the
same spot where it was terminated. Crawl keeps a persistent database
that allows multiple crawls without revisiting sites.
The main reason for writing crawl was the lack of simple open source
web crawlers. Crawl is only a few thousand lines of code and fairly
easy to debug and customize.
Features
+ Saves encountered JPEG images
+ Image selection based on regular expressions and size constraints
+ Resume previous crawl after graceful termination
+ Persistent database of visited URLs
+ Very small and efficient code
+ Supports robots.txt
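As a toy illustration of the approach described above (depth-first
traversal with a persistent visited set, so a crawl can be resumed),
here is a short Python sketch; crawl itself is a separate program,
and this sketch skips robots.txt and uses a naive regex for links:

    import os
    import re
    import shelve
    import urllib.parse
    import urllib.request

    HREF = re.compile(rb'href="([^"#]+)"', re.I)

    def crawl(start, db_path="visited.db", max_pages=100):
        with shelve.open(db_path) as visited:  # persists across runs
            stack = [start]
            while stack and max_pages > 0:
                url = stack.pop()              # LIFO => depth-first
                if url in visited:
                    continue
                visited[url] = True
                max_pages -= 1
                try:
                    body = urllib.request.urlopen(url, timeout=10).read()
                except (OSError, ValueError):
                    continue
                if url.lower().endswith((".jpg", ".jpeg")):
                    name = os.path.basename(urllib.parse.urlsplit(url).path)
                    with open(name or "image.jpg", "wb") as f:
                        f.write(body)          # save encountered JPEGs
                    continue
                for link in HREF.findall(body):
                    text = link.decode("ascii", "ignore")
                    stack.append(urllib.parse.urljoin(url, text))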
--
TightVNC is an enhanced version of VNC, optimized to work over slow
network connections such as low-speed modem links. While the
original VNC may be very slow when your connection is not fast
enough, TightVNC lets you work remotely almost in real time in most
environments. Besides bandwidth optimizations, TightVNC also
includes many other improvements, optimizations, and bugfixes over
VNC. Note that TightVNC is free, cross-platform, and compatible with
the standard VNC.
WWW: http://www.tightvnc.org/
Submitted by Rob Casey <rob@minauros.com>