Larbin is a web crawler (also called a (web) robot, spider, scooter, ...). It is intended to fetch a large number of web pages to fill the database of a search engine. With a fast enough network connection, Larbin should be able to fetch more than 100 million pages on a standard PC.

from Giovanni Bechis <g.bechis@snb.it> with tweaks by me and ajacoutot@
ok ajacoutot@
$OpenBSD: patch-src_global_cc,v 1.1.1.1 2007/05/07 11:17:07 jasper Exp $
--- src/global.cc.orig	Mon May 7 12:43:37 2007
+++ src/global.cc	Mon May 7 12:43:38 2007
@@ -84,7 +84,7 @@ int global::IPUrl = 0;
  * Everything is read from the config file (larbin.conf by default)
  */
 global::global (int argc, char *argv[]) {
-  char *configFile = "larbin.conf";
+  char *configFile = "!!SYSCONFDIR!!/larbin/larbin.conf";
 #ifdef RELOAD
   bool reload = true;
 #else