fgets
(PHP 4, PHP 5)
fgets — Gets line from file pointer
Description
string fgets ( resource handle [, int length] )
Reads at most length - 1 bytes from the file pointer referenced by handle and returns the string. Reading ends when length - 1 bytes have been read, when a newline (which is included in the return value) is found, or when EOF is reached, whichever comes first. If length is not specified, it keeps reading until the end of the line is reached.
If an error occurs, FALSE is returned.
Common pitfall:
People used to the behaviour of fgets in C should note the difference in the conditions under which EOF is returned.
The file pointer must be valid, and must point to a file successfully opened by fopen() or fsockopen() (and not yet closed by fclose()).
A simple example follows.
Example 595. Reading a file line by line
<?php
$handle = @fopen("/tmp/inputfile.txt", "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
?>
Note: The length parameter became optional in PHP 4.2.0. In versions prior to PHP 4.3, if this parameter was omitted, the line length was assumed to be 1024. If the majority of the lines in the file are larger than 8 KB, it is more resource-efficient for your script to specify the maximum line length.
See also fgetss(), fread(), fgetc(), stream_get_line(), fopen(), popen(), fsockopen(), and stream_set_timeout().
fgets
17-Nov-2006 12:01
16-Nov-2006 11:57
In response to:
tavernadelleidee[italy]
09-Mar-2006 12:44
Usually, big log files should not be processed with PHP. The default memory allocation size (usually 8 MB to 16 MB) is too small to process such files. Even if you use a quite evolved script, you will still have to deal with timeouts generated by the client browser and/or the server. It is extremely dangerous to play around with max_execution_time and other, often underestimated, memory-, time-, and calculation-expensive configuration options.
If you really need to work on such big logs, I strongly encourage you to use a database for storage, or to rotate the files (e.g. on reaching a file size of 512 KB, per connected user, ...) while they are being generated or at processing time.
PHP is definitely NOT designed to parse log files. The best way is still a Unix shell with common tools or editors like tail, vi, ... and often that is the only way.
For small scripts that use database logging, everyone may feel free to mail me (devel at lab minus nine dot com). You aren't going to reinvent the wheel, are you? ;-)
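A minimal sketch of the size-based rotation idea above, assuming a hypothetical log path and the 512 KB threshold mentioned in the note:
<?php
// Hypothetical log path and the 512 KB threshold from the note above
$logfile = '/tmp/app.log';
$maxsize = 512 * 1024;

clearstatcache();
if (file_exists($logfile) && filesize($logfile) >= $maxsize) {
    // Move the full log aside; new entries start a fresh file
    rename($logfile, $logfile . '.' . date('YmdHis'));
}

// Append the next entry to the (possibly fresh) log
$fp = fopen($logfile, 'a');
fwrite($fp, date('c') . " some log entry\n");
fclose($fp);
?>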
13-Aug-2006 05:03
For sockets, if you don't want fgets(), fgetc(), etc. to block when there's no data, use socket_set_blocking($handle, false); and socket_set_blocking($handle, true); to set it back again.
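A minimal sketch of that, assuming an arbitrary host and port; socket_set_blocking() works on any stream opened with fsockopen():
<?php
// Hypothetical host/port; any fsockopen() stream will do
$fp = fsockopen('example.com', 80, $errno, $errstr, 30);
if ($fp) {
    // Non-blocking: fgets() returns immediately (possibly with no data)
    socket_set_blocking($fp, false);
    $line = fgets($fp, 1024);

    // Set it back to the default blocking behaviour
    socket_set_blocking($fp, true);
    fclose($fp);
}
?>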
15-Jul-2006 06:21
fgets is SLOW for scanning through large files. If you don't have PHP 5, use fscanf($file, "%s\n") instead.
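A rough sketch of that approach (the file name is arbitrary); note that %s only matches up to the next whitespace, so each call consumes one line but returns only its first token:
<?php
// Arbitrary file name; each fscanf() call consumes one line of the file
$fp = fopen('big.log', 'r');
$count = 0;
while (!feof($fp)) {
    // "%s" stops at whitespace, so this returns the first token of the line
    $parsed = fscanf($fp, "%s\n");
    if (is_array($parsed)) {
        $count++;
    }
}
fclose($fp);
echo "$count lines scanned\n";
?>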
23-May-2006 06:09
An easy way to authenticate Windows Domain users from scripts running on a non-Windows or non-Domain box - pass the submitted username and password to an IMAP service on a Windows machine.
<?php
$server = 'imapserver';
$user   = 'user';
$pass   = 'pass';

if (authIMAP($user, $pass, $server)) {
    echo "yay";
} else {
    echo "nay";
}

function authIMAP($user, $pass, $server) {
    $connection = fsockopen($server, 143, $errno, $errstr, 30);
    if (!$connection) return false;
    $output = fgets($connection, 128); // banner
    fputs($connection, "1 login $user $pass\r\n");
    $output = fgets($connection, 128);
    fputs($connection, "2 logout\r\n");
    fclose($connection);
    if (substr($output, 0, 4) == '1 OK') return true;
    return false;
}
?>
The Macintosh line endings mentioned in the docs refer to Mac OS Classic. You don't need this setting for interoperability with the Unix-like OS X.
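If a file really does use Classic Mac line endings (a lone \r), the setting in question is presumably auto_detect_line_endings; a minimal sketch assuming that is the one meant, with a hypothetical file name:
<?php
// Make fgets() also recognise lone \r (Mac OS Classic) as a line ending
ini_set('auto_detect_line_endings', true);

$fp = fopen('classic_mac_file.txt', 'r');
while (($line = fgets($fp)) !== false) {
    echo $line;
}
fclose($fp);
?>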
09-Mar-2006 08:44
I think the quickest way to read a (long) file with the rows in reverse order is:
<?php
$myfile = 'myfile.txt';
$command = "tac $myfile > /tmp/myfilereversed.txt";
passthru($command);

$ic = 0;
$ic_max = 100; // stops after this number of rows
$handle = fopen("/tmp/myfilereversed.txt", "r");
while (!feof($handle) && ++$ic <= $ic_max) {
    $buffer = fgets($handle, 4096);
    echo $buffer . "<br>";
}
fclose($handle);
?>
It echoes the rows while it is reading the file, so it is good for long files like logs.
Borgonovo
05-Jan-2006 06:20
I would have expected the same behaviour from these bits of code:
<?php
/* This times out correctly */
while (!feof($fp)) {
    echo fgets($fp);
}

/* This times out before EOF */
while ($line = fgets($fp)) {
    echo $line;
}

/* A reasonable fix is to set a long timeout */
stream_set_timeout($fp, 180);
while ($line = fgets($fp)) {
    echo $line;
}
?>
06-Dec-2005 05:17
When working with VERY large files, PHP tends to fall over sideways and die.
Here is a neat way to pull chunks out of a file very fast; it won't stop in mid-line, but rather at the end of the last complete line. It pulled a 30+ million line, 900 MB file through in ~24 seconds.
NOTE:
$buf just holds the current chunk of data to work with. If you try "$buf .=" (note the dot in front of '=') to append to $buf, the script will slow to a grinding crawl around 100 MB of data, so work with the current data and then move on!
// File to be opened
$file = "huge.file";
// Open the file (don't use 'a+', or the pointer will be wrong!)
$fp = fopen($file, 'r');
// Read 16 MB chunks
$read = 16777216;

while (!feof($fp)) {
    $rbuf = fread($fp, $read);
    if (feof($fp)) {
        // At the end of the file, just grab the rest and stop
        $buf = $rbuf;
    } else {
        // Find the last "\n" so the chunk ends on a complete line
        $i = strrpos($rbuf, "\n");
        if ($i === false) {
            // No newline in this chunk: keep it whole
            $i = strlen($rbuf) - 1;
        }
        // This is the buffer we want to do stuff with, maybe throw it to a function?
        $buf = substr($rbuf, 0, $i + 1);
        // Point the file pointer back to just after the last "\n"
        fseek($fp, ftell($fp) - (strlen($rbuf) - ($i + 1)));
    }
}
fclose($fp);
01-Dec-2005 10:51
It appears that fgets() will return FALSE on EOF (before feof() gets a chance to detect it), so this code will throw an exception:
while (!feof($fh)) {
    $line = fgets($fh);
    if ($line === false) {
        throw new Exception("File read error");
    }
}
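A common way around this (not from the note above) is to loop on the return value of fgets() itself and only treat FALSE as an error when the stream is not actually at EOF:
while (($line = fgets($fh)) !== false) {
    // process $line
}
if (!feof($fh)) {
    // FALSE came from a real read error, not from reaching EOF
    throw new Exception("File read error");
}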
08-Jan-2005 04:11
Saku's example may also be used like this:
<?php
@ $pointer = fopen("$DOCUMENT_ROOT/foo.txt", "r"); // the @ suppresses errors, so you have to test the pointer for existence
if ($pointer) {
    $ATEXT = array();
    $I = 0;
    while (!feof($pointer)) {
        $preTEXT = fgets($pointer, 999);
        // $TEXT .= $preTEXT; this is better for a string
        $ATEXT[$I] = $preTEXT; // maybe better as an array
        $I++;
    }
    fclose($pointer);
}
?>
19-Nov-2004 11:43
Sometimes the strings you want to read from a file are not separated by an end-of-line character. The C-style getline() function solves this. Here is my version:
<?php
function getline($fp, $delim)
{
    $result = "";
    while (!feof($fp)) {
        $tmp = fgetc($fp);
        if ($tmp == $delim) {
            return $result;
        }
        $result .= $tmp;
    }
    return $result;
}

// Example:
$fp = fopen("/path/to/file.ext", 'r');
while (!feof($fp)) {
    $str = getline($fp, '|');
    // Do something with $str
}
fclose($fp);
?>
04-Nov-2004 07:54
Note that - AFAIK - fgets reads a line until it reaches a line feed (\n). Carriage returns (\r) aren't processed as line endings.
However, nl2br inserts a <br /> tag before carriage returns as well.
This is useful (but not nice, I must admit) when you want to store more lines in one.
<?php
function write_lines($text) {
    $file = fopen('data.txt', 'a');
    // store embedded newlines as \r so fgets() later reads the whole record as one line
    fwrite($file, str_replace("\n", "\r", $text) . "\n");
    fclose($file);
}

function read_all() {
    $file = fopen('data.txt', 'r');
    while (!feof($file)) {
        $line = fgets($file);
        echo '<u>Section</u><p>' . nl2br($line) . '</p>';
    }
    fclose($file);
}
?>
Try it.
If you need to simulate an unbuffered fgets so that stdin doesn't hang there waiting for some input (i.e. it reads only if there is data available), use this:
<?php
function fgets_u($pStdn) {
    $pArr = array($pStdn);

    if (false === ($num_changed_streams = stream_select($pArr, $write = NULL, $except = NULL, 0))) {
        print("\$ 001 Socket Error : UNABLE TO WATCH STDIN.\n");
        return FALSE;
    } elseif ($num_changed_streams > 0) {
        return trim(fgets($pStdn, 1024));
    }
}
?>
13-Aug-2004 01:03
Take note that fgets() reads 'whole lines'. This means that if a file pointer is in the middle of a line (e.g. after fscanf()), fgets() will read the following line, not the remaining part of the current line. You could expect it to read until the end of the current line, but it doesn't; it skips to the next full line.
17-Jun-2004 11:13
If you need to read an entire file into a string, use file_get_contents(). fgets() is most useful when you need to process the lines of a file separately.
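For example (reusing the /tmp/inputfile.txt path from the manual example; any readable file will do):
<?php
// Whole file as one string
$text = file_get_contents('/tmp/inputfile.txt');

// Line-by-line processing
$handle = fopen('/tmp/inputfile.txt', 'r');
while (($line = fgets($handle)) !== false) {
    // work on $line here
}
fclose($handle);
?>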
05-Jun-2004 09:47
As a beginner I would have liked to see how to read a file into a string for later use, and not only how to directly echo the fgets() result. This is what I derived:
<?php
@ $pointer = fopen("$DOCUMENT_ROOT/foo.txt", "r"); // the @ suppresses errors, so you have to test the pointer for existence
if ($pointer) {
    $TEXT = '';
    while (!feof($pointer)) {
        $preTEXT = fgets($pointer, 999);
        $TEXT = $TEXT . $preTEXT;
    }
    fclose($pointer);
}
?>
23-Feb-2004 09:35
If you have trouble reading binary data with versions <= 4.3.2, upgrade to 4.3.3.
The binary-safe implementation seems to have had bugs which were fixed in 4.3.3.