The Canada2 Network is an "in process" internet development that has, to date, required years of dedicated research, discovery and experience. Officially launched on June 29, 1999, this co-creative, dynamic, community-oriented project is a 21st-century social model for economization and community connectivity.

Vision Realization Updates:

On Earth Day, April 22, 2002, the renewed Canada2 Network flagship at http://www.toronto2.com was officially launched! Explore hundreds of topically oriented newsfeeds, interactive community tools, and more.


Canada2's mission is to establish a nation-wide organization of online community centres dedicated to enhancing life in Canada.

    The flagship website, Toronto2, is one of many websites that will provide services to every metropolitan centre and region across Canada. The "internetwork" is operated from central server platforms located in Toronto.
Canada2 proposes to offer every Canadian a permanent home on the web.
    In metropolitan centres, commercial organizations are concerned that their livelihoods are threatened by web-centred businesses. Organizations must consider the implications of losing their clientele to these fledgling competitors. Small businesses without adequate budgets end up developing websites that may become no more than financial liabilities. They realize, however, that if they do nothing they may fall further behind in the quest for a viable place in the new paradigm. We plan to provide individual Canadians with superior services while offering desperately needed solutions to this growing commercial problem.
Canada 2 plans to take the universal applicability of the internet and apply it locally.
    Our goal is to see the Canada2 Network become a seamless service centre meeting the many needs of individual Canadians. To this end, we now have in place our prototype community infrastructure (showcased online at www.Toronto2.com).
Canada2 will assist Canadians by providing locally focussed internet environments that facilitate both communication and services.

    In many ways the country we call "Canada" is the envy of the world!
    Our multi-cultural mosaic, our land base, and the distinctiveness of our global identity as peacemakers are priceless attributes.
The Canada 2 Network creatively joins our diversities in age, interest, education, culture, gender, lifestyle, religion, vocation, activity and geographic location.
    The Canada2 Network brings Canadians together in cyberspace with the express intent of real-world contact. The Network provides an impressive array of online communication tools and programmes designed for local application; it then applies these tools to organize access to the "real world".

    FOR EXAMPLE

  • Online appointment, housecall, and delivery services.
  • Virtual gatherings around common interests leading to real-world meetings and beneficial initiatives.
  • Special community website domains for every major metropolitan centre, geographic region, and facet of Canada.
  • Communication platforms, including instant messenger services, group chat rooms, discussion and message forums, and online conference rooms.
  • A nationwide retailers' marketplace that aims to balance the distribution of supply and demand across Canada while bringing the best prices, variety, and informed choices to Canadian consumers.
  • Proud to be Canadian? Canada2 offers each individual Canadian a free, permanent and private email address in the metropolitan city or region of their choice.
  • Canadians are also offered free personal homesites at metropolitan websites (such as www.Toronto2.com, www.Vancouver2.com, www.Ottawa2.com); see the illustrative sketch after this list.

  • With this overall facilitation of world wide web traffic, along with an interactive demographic database of individual consumer preferences, the Canada2 Network uniquely positions itself as a mainstream ideal for sponsorship and web advertising. The revenues received from the natural and inherently distinctive draw of our city and country websites will benefit all Canadians by sponsoring the continued stewardship of our permanent homes on the world wide web.
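
    The following is a minimal, hypothetical sketch (in Python) of how the free regional email addresses and personal homesites described above might be formed from a member's chosen metropolitan domain. The registry contents, the Member record, and the "~username" homesite path are illustrative assumptions made for this sketch only, not a published Canada2 specification.

# A hypothetical sketch only: the domain registry, Member record, and
# "~username" homesite path are assumptions, not Canada2's actual scheme.
from dataclasses import dataclass

# Assumed registry of regional community domains.
REGIONAL_DOMAINS = {
    "Toronto": "toronto2.com",
    "Vancouver": "vancouver2.com",
    "Ottawa": "ottawa2.com",
}

@dataclass
class Member:
    username: str   # chosen handle, assumed unique within a region
    region: str     # a key in REGIONAL_DOMAINS

def email_address(member: Member) -> str:
    """Permanent email address at the member's chosen regional domain."""
    return f"{member.username}@{REGIONAL_DOMAINS[member.region]}"

def homesite_url(member: Member) -> str:
    """Personal homesite hosted under the regional community website."""
    return f"http://www.{REGIONAL_DOMAINS[member.region]}/~{member.username}"

if __name__ == "__main__":
    m = Member(username="jsmith", region="Toronto")
    print(email_address(m))   # jsmith@toronto2.com
    print(homesite_url(m))    # http://www.toronto2.com/~jsmith

    A real deployment would also need account registration, uniqueness checks, and mail hosting behind these addresses; the sketch only shows how one address scheme could tie a member to a regional domain.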

    1) The Principal Developers
    Canadian "open source" developers are leading the way in building the Canada2 Network. The cooperative stewardship process is co-ordinated entirely online through the Canada2.com virtual office. This dream of a permanent community Network will be realized because of the shared vision, dedication, expertise, and sense of social responsibility of the Canadian people.

    2) Flexibility and Adaptability
    Critical to success is a shared understanding that we are an open source environment. Our websites are online community centres that can be considered a shared privilege of the Canadian people. We have therefore allowed for growing pains, and in fact require them, to create a stronger sense of community.

    6) Transparency of Operation
    Transparency of operation creates a safe and desirable environment for both community and commerce. A co-operative approach leads to involvement, shared resources, and loyalty.

    More convenience, more transactional accountability, more alternatives, more communication, and more community spirit.

    In considering future objectives:

    The Canada2 Network as envisioned is a network of web channels that, in presentation, speak to viewers in the language of television. We expect Canada2 to become a significant force in the emerging medium of interactive television. The fundamental community-oriented infrastructure already in place will be positioned perfectly when the two technologies (TV and the Internet) become, for all practical purposes, one.

The Canada2 Network, though privately owned, will be transferred into the responsible stewardship of the appropriate communities when fully operational.
    We also plan to use the Canada2 model to facilitate the continuance of the programme worldwide. Together we have the potential to help create the global village!
    The Canada2 Project is the culmination of concepts that began many years ago and have, through time, been nurtured into realization. The principal architects have a consistently proven track record of carrying out their projects with both creative vision and integrity. They are...
    Friendship Enterprises
    G.S.Cole and R.R. Sinclair
    Partners

    Garth Cole is a Supervising City of Toronto Information Officer who brings many years of experience speaking one-on-one with thousands of Torontonians about the 3 R's, the environment, and their personal responsibility for the welfare of our community. Garth's expertise in public relations and community education underpins the "eye to eye", co-operative, common-ground approach of the Canada2 Network.

    Ritchie Sinclair is the principal developer and the primary creative force behind the Canada2 Network. He is both an accomplished website designer and internet programmer. He has a lifetime of study and leadership in the arts, along with a degree in design. He is the protégé of Ontario's Ojibwa Grand Shaman and world-renowned Native artist, Norval Morrisseau.

Canada is a free country
Canada2 is a free community network
You are welcome to participate in any Canada2 website activities
whether you live in the region or not
Support, sponsorship, and input are always welcome.
This is your network, use it, share it, move in, start something!
Let's show the Global Village the Canadian Way to Community!

© 1999 - 2006 Stardreamer Netmedia. All rights reserved.