
world    hits
r4wp     443
r3wp     4402
total:   4845

results window for this page: [start: 1101 end: 1200]

world-name: r3wp

Group: !AltME ... Discussion about AltME [web-public]
Kaj:
9-Apr-2006
The file selector is horrid, but in particular, I typed the name 
of a new file, which happened to have spaces in it, and instead got 
a number of files, one for each part of the name. This is very unexpected.
Henrik:
26-Apr-2006
here's an idea: how about some kind of softlinks to specific places 
internally to AltME? Say I want to link to another group, a specific 
check list or a calendar day, a bug report in a bug tracker or a 
file in the filesharing tool.
Kaj:
30-Apr-2006
On AltME 1.1.28, Windows 2000, I just uploaded a new PDF. Immediately 
thereafter, the local file was shown as out of date
Kaj:
30-Apr-2006
Download failed for Syllable Inc./Marketing/De_Graaff_OSS_adoptiegedrag_van_ondernemers_webversie.pdf 
because write-failed - Is the file open or in use?
Kaj:
30-Apr-2006
This keeps happening at the end of the download, so I can't even 
access the file from AltME
Thør:
17-Jun-2006
Tried syncing to the RebGUI discussion twice, but got disconnected 
at more or less the same percentage. Is it possible to have AltME 
download the "chat files" on a per-"number of messages" basis, instead 
of downloading the entire chat file in one session? That way people 
with syncing problems like myself could at least download the 
messages from the discussion group a little at a time until the entire 
"chat file" is downloaded, rather than be frustrated when we get 
disconnected at 99% and have to retry syncing. It would also lessen 
the annoyance of other people seeing several dots made by the same 
poster.

Just my 2 cents.

Cheers!
james_nak:
20-Jul-2006
Freaky day today. On my work's World my account became "disconnected" 
and when I tried to revisit the world it would not recognize me. 
I checked out the user file and Master was back on. Fortunately I 
had a backup so I replaced Master with my account. Kinda scary.
Maarten:
21-Jul-2006
An X-file episode.... But Reichart a) doesn't smoke b) isn't an alien. 
Although the latter may be an attempt of another alien to protect 
our species ;-)
Henrik:
3-Sep-2006
playing around with file sharing here... is it wise to just allow 
DOS batch files to run without permission? when I upload a .bat file 
and click on it from another host, it immediately runs.
Anton:
4-Sep-2006
I meant, does it run on the machine from which you clicked, or does 
it run on the server where the .bat file lives? (On reflection, I 
suppose it runs on the machine from which you clicked.)
Henrik:
5-Sep-2006
anton:


1. on Machine A, an XP box, I create a bat file and add it to my 
file sharing list on my world

2. on Machine B, also an XP box, I open AltME on that same world 
with the same user account and notice that a new file is ready for 
download.
3. on Machine B, I click it and it downloads.

4. on Machine B, I click it again, and it immediately opens a DOS 
shell window and runs.
Pekr:
5-Sep-2006
maybe I just don't understand the issue - what level do you mean? 
You downloaded a .bat file, which is just that, a .bat file. It now 
depends on what you can do under your XP box (user permissions)
Anton:
5-Sep-2006
Oh I see what you mean - single-click from AltME executes the .bat 
file.
Louis:
12-Oct-2006
Why is file share disabled in this world?
Louis:
12-Oct-2006
Any plan to turn file sharing on?
Henrik:
16-Oct-2006
Only the last test above was done in Windows XP. The others were 
done in OSX.


When I paste from TextEdit in MacOSX into a VID textarea, ^M is pasted, 
and it doesn't work right. When I paste from Notepad in Windows into 
a VID textarea, ^/ is pasted and it works correctly. So the results 
are different, yes.


When I copy and paste from a VID textarea to TextEdit, it looks alright, 
but this might be due to how TextEdit handles different line ending 
chars the same. I can't see the line endings in TextEdit. When I 
save the TextEdit file to disk in Windows Latin 1 format and read 
it in REBOL, the line endings are correct.


When I copy and paste between two VID text areas in OSX, it looks 
alright.


OSX Rebol/View also has another bug: when using cursor up or down 
in a text area, the cursor also moves one char to the left.
Group: Rebol School ... Rebol School [web-public]
denismx:
27-May-2007
Project 1: Read a web page given a URL, find some data in the page, 
append it to a file on disk. Read the given disk file and show the 
data on screen.
PatrickP61:
25-Jun-2007
Ahhh so much to learn and not enough time!!!  Thanks for your patience
Ok, on to another issue.


I have a text file as a printable report that contains several pages 
within it.  Each line can be as large as 132 columns wide (or less).

- The literal  "    Page " will begin in column 115 and that indicates 
the start of a printed page.


I want to write a script that will read this text file one-page-at-a-time, 
so I can do some processing on the page.


How do I write a script to load in a single "page"?  I am guessing 
that I need to open a PORT and have rebol read all the lines until 
I get "....Page." in byte position 115.

Any suggestions?
Geomol:
25-Jun-2007
I think that'll fail if the file is empty!
PatrickP61:
26-Jun-2007
Hi everyone,


I want to write out a Ruler line to a text file for a specified length 
of bytes similar to the following for 125 bytes in length:

----+---10----+---20----+---30----+---40----+---50----+---60----+---70----+---80----+---90----+--100----+--110----+--120----+

I tried the following code, but it's not what I want:

	Ruler: for Count 10 125 10 [ prin "----+---" Count ]

I got this instead:

	----+-------+-------+-------+-------+-------+-------+-------+-------+-------+-------+-------+---
	Ruler: 120

Any suggestions?
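For reference, a sketch of a ruler builder in REBOL that produces the layout described above: each 10-column segment ends with its right-aligned column number, and the tail is padded out to 125 columns. (A sketch, not tested against any particular REBOL build.)

```rebol
ruler: copy ""
for count 10 120 10 [
    num: form count
    ; take just enough dashes so dashes + number fill exactly 10 columns
    append ruler join copy/part "----+----" (10 - length? num) num
]
append ruler copy/part "----+----" 5    ; pad from column 120 out to 125
write %ruler.txt ruler
```

This yields "----+---10----+---20 ... ----+--120----+", 125 characters in all.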
PatrickP61:
27-Jun-2007
Hi All,  
Have any Rebolers dealt with UniCode files?


Here is my situation.  I work on an IBM AS400 that can "port" files 
over to the PC.  Notepad opens it up just fine, but Rebol doesn't 
see it the same way.  If I cut & paste the contents of the file into 
an empty Notepad window and save it, Rebol can see it just fine.  Upon 
further study, I noticed at the bottom of the SAVE AS window that 
Encoding was set to UNICODE for the AS400 file, while the cut & paste 
one was set to ANSI.


Does Rebol want ANSI text files only, or can it read UNICODE files 
too?
PatrickP61:
27-Jun-2007
When you try to save a document in Notepad, the encoding choices 
are UTF-8, UNICODE, and ANSI, among others.  UNICODE may be the same 
as UTF-16, because it does look like every single character is saved 
as two bytes.


The code (rejoin extract read InFile 2) does eliminate the double 
characters, but I noticed that the entire file is still double spaced 
-- as if the newline is coded twice and not removed by the rejoin. 
But that extra newline may be more of an annoyance than anything else.
PhilB:
28-Jun-2007
Patrick ... on your AS400 problem .... how is the data transferred 
to the PC?  Is it directly from an AS400 file via the built-in data 
transfer utility, or is it a file from the IFS?

(I have used Rebol to read data transferred from an AS400 and didn't 
get the data as unicode.)
PatrickP61:
28-Jun-2007
Hi Anton -- This is my simulated input for a unicode text file:
	Line1...10....+...20....+...30....+...40....+...50
	Line2...10....+...20....+...30....+...40....+...50
If I run this code:
	InFile:  %"Small In unicode.txt"
	InText:  rejoin extract read InFile 2  ; convert from UNICODE to ANSI, but keeps double spacing
	OutFile: %"Test Out.txt"
	write OutFile InText
	print InText
I get these results
	ÿLine1...10....+...20....+...30....+...40....+...50
	
	Line2...10....+...20....+...30....+...40....+...50
	

I get them in the output file when I use the Rebol editor, in Notepad 
(when I open the file), and in the console when I PRINT InText.
PatrickP61:
28-Jun-2007
At first, I thought it might just be some stray bytes coming from the 
AS400, but I was able to re-create a file using Notepad and got the 
same results.
Any of you should be able to test this out by:
1.  Open Notepad
2.  Type in some text
3.  Save the file with Encoding set to UNICODE
PatrickP61:
28-Jun-2007
Gregg -- I don't know how to reveal the binary/ascii values of the 
file, but the Spanish-looking ÿ looks like it may be hex FF.  Do 
you have rebol code that can convert the characters into hex?
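As an aside, one way to inspect the raw bytes in hex from the REBOL console (the file name is the one from the earlier example; a sketch only):

```rebol
data: read/binary %"Small In unicode.txt"
print enbase/base copy/part data 16 16   ; first 16 bytes as a hex string
```

On a UTF-16LE file with a byte order mark, the output should start with FFFE.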
Sunanda:
28-Jun-2007
FFFE is a "byte order mark" -- something that has been slipped in 
at the beginning of the file to indicate the file is in UTF-16, little 
endian format. If it started FEFF you'd have to extract all the 
other bytes.

Looks like the original file (or whatever did the EBCDIC to UTF-16 
conversion on the AS400) is using A0A0 to mean newline. You may 
need to clean those up by hand.
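Putting those observations together, a cleanup sketch for a UTF-16LE file whose text is plain ASCII (file name taken from the earlier example; the last step assumes the doubled lines come from two-byte CR LF line endings):

```rebol
data: read/binary %"Small In unicode.txt"
if #{FFFE} = copy/part data 2 [data: skip data 2]  ; drop the byte order mark
text: rejoin extract to-string data 2              ; keep the low (first) byte of each pair
replace/all text crlf newline                      ; collapse CR LF so lines aren't doubled
```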
PatrickP61:
5-Jul-2007
Situation:	I want to read in an input file and parse it for some 
strings

Current:	My test code will do the parsing correctly IF the input 
block contains each line as a string

Problem:	When I try to run my code against the test file, It treats 
the contents of the file as a single string.

Question:	How do I have Rebol read in a file as one string per line 
instead of one string?
In-text:	[	"Line 1                        Page     1"
		"Line 2    Name      String-2"          
		"Line 4    Member    String-3 on 12/23/03"
		"Line 5    SEQNBR    abcdef               "                
		"Line 6       600    Desc 1 text         12/23/03"
		"Line 7      5400    Desc 2    Page 4    12/23/03"
		"Line 8    Number of records searched"	]
 Get-page:		[thru "     Page "	copy Page-id to end]
 Get-file:		[thru "Name  "		copy Name-id to end]
 Get-member:	[thru "Member  "	copy Member-id to end]

 Page-id:	Name-id:	Member-id:	"-"

 for N 1 length? In-text 1 [
	parse In-text/:N	Get-page
	parse In-text/:N	Get-file
	parse In-text/:N	Get-member
	] 
 print	[ "Page"	Page-id		]
 print	[ "Name"	Name-id		]
 print	[ "Member"	Member-id	]
Sunanda:
5-Jul-2007
Try
  in-text: read/lines %file-name
PatrickP61:
5-Jul-2007
In my example above, I have three parse rules defined.  I need to 
add several more.

Does PARSE process the string once per rule?  i.e. does it scan 
the string for Get-page, then Get-file, then Get-member (scanning 
the string 3 times), or can I structure the parse rules together 
to process the string in one pass?
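For reference, one way to make a single pass per line is to combine the rules as alternatives; parse tries each branch in turn and stops at the first that matches. A sketch reusing the rule names and literals from the example above:

```rebol
combined: [
      thru "     Page "  copy Page-id   to end
    | thru "Name  "      copy Name-id   to end
    | thru "Member  "    copy Member-id to end
]
foreach line In-text [parse line combined]
```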
Tomc:
5-Jul-2007
if your page,name & member always exist and are in that order ...

parse/all read %file [
 some [ 
	thru "Page " copy token integer! (print ["Page" token]) 
	thru "Name " copy token to newline(print ["Name" token])
	thru "Member " copy token to newline (print ["Member" token])
	]
]
PatrickP61:
5-Jul-2007
Tomc -- This version means that I need to have the entire file read 
in as a string -- not with Read/Lines -- because the newline will 
be the "delimiter" within the string, while Read/Lines will split 
at each newline into a separate string inside a block.  Do I have 
that right?
PatrickP61:
5-Jul-2007
My Page, Name, & Member are always in the same order on separate 
pages within a file.  Like so:
Line 1     Page 1
Line 2     Name
Line 3     Member
Line n...  Member
Line 50  Member
Line 51  Page   2
Line 52  Name
Line 53  Member
Line 54  Member
...
PatrickP61:
6-Jul-2007
Thank you Sunanda -- I will give that a try.


Just to let you know -- My goal is to convert a printable report 
that is in a file into a spreadsheet.
Some fields will only appear once per page like PAGE.

Some fields could appear in a new section of the page multiple times 
like NAME in my example.
And some fields could appear many times per section like MEMBER:
_______________________
Page header          PAGE     1
Section header     NAME1.1
Detail lines            MEMBER1.1.1
Detail lines            MEMBER1.1.2
Section header     NAME1.2
Detail lines            MEMBER1.2.1
Detail lines            MEMBER1.2.2
Page header         PAGE    2
(repeat of above)____________


I want to create a spreadsheet that takes the different captured 
fields and places them on the same line as the detail lines, like so...
______________________
Page   Name       Member
1          NAME1.1  MEMBER1.1.1
1          NAME1.1  MEMBER1.1.2
1          NAME1.2  MEMBER1.2.1
1          NAME1.2  MEMBER1.2.2

2          NAME2.1  MEMBER2.1.1  ...    (the version numbers are 
simply a way to indicate which captured field I am referring to: 
Page, Name, Member)


Anyway -- that is my goal.  I have figured out how to do the looping, 
and can identify the record types, but you are right about the possibility 
of mis-identifying lines.
PatrickP61:
6-Jul-2007
This is my pseudocode approach:


A new page is identified by page header text that is the same on 
each page and the word PAGE at the end of the line.

A new section is identified by section header text that is the same 
within the page and the text "NAME . . . . :".

Member lines do not have an identifying mark on the line but are 
always preceded by the NAME line.

Member lines continue until a new page is found, or the words "END 
OF NAME" are found (which I didn't show in my example above).


Initialize capture fields to -null-     like PAGE, NAME
Initialize OUTPUT-FLAG to OFF.

Loop through each line of the input file until end of file EOF.
/|\	If at a New-page line
 |	or at end of Name section
 |		Set OUTPUT-FLAG  OFF
 |	If OUTPUT-FLAG  ON

 |		Format output record from captured fields and current line (MEMBER)
 |		Write output record
 |	IF at New Name line
 |		Set OUTPUT-FLAG ON
 |	IF OUTPUT-FLAG OFF
 |		Get capture fields like PAGE-NUMBER when at a PAGE line
 |		Get NAME when at a NAME line.
 |____	Next line in the file.
Tomc:
7-Jul-2007
Yes Patrick, you have it right. The rules I gave would fail 
since you have multiple names/members.

I would try to get away from the line-by-line mentality 
and try to break it into your conceptual record groupings: 
file, pages, sections, and details...

One trick I use is to replace a string delimiter for a record 
with a single char, so parse returns a block of that record type.

This is good because then, when you work on each item in the block 
in turn, you know any fields you find do belong to this record and 
that you have not accidentally skipped to a similar field in a later 
record.

something like this 


pages: read %file
replace/all/case pages "PAGE" "^L"
pages: parse/all pages "^L"

foreach page pages[
	p: first page
	page: find page newline
	replace/all/case page "NAME" "^L"
	sections: parse page "^L"
	foreach sec sections [
		s: first sec
		sec: find sec newline
		parse sec [
			any [thru "Member" copy detail to newline 
				newline (print [p tab s tab detail])
			]
		]
	]
]
PatrickP61:
18-Jul-2007
In-port:	open/lines In-file
 while [not tail? In-port] [
	print In-port
	In-port:	next In-port
	]

 close In-port
PatrickP61:
18-Jul-2007
This is not doing what I want.

I want it to continue running through all lines of the file and 
print them.
PatrickP61:
18-Jul-2007
My goal is to be able to control how much of a file is loaded into 
a block then process the block and then go after the next set of 
data.  That is why I am using PORT to do this function instead of 
reading everything into memory etc.
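For reference, that incremental pattern can be sketched with copy/part on a lines port (chunk size and file name are arbitrary here, and copy/part is assumed to return none once the port is exhausted):

```rebol
in-port: open/lines/direct %in-file.txt
while [all [lines: copy/part in-port 100  not empty? lines]] [
    ; 'lines is a block of up to 100 strings; process the chunk here
    foreach line lines [print line]
]
close in-port
```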
Vladimir:
26-Oct-2007
What could be the problem with this script?

set-net [[user-:-mail-:-com] smtp.mail.com pop3.mail.com]
today: now/date
view center-face layout [
	size 340x120
	button "Send mail" font [size: 26] 300x80 [
		send/attach/subject [user-:-mail-:-com] "" %"/c/file.xls" reduce [join "Today  " :danas]
		quit
	]
]

I get this error:


** User Error: Server error: tcp 554 5.7.1 <[user-:-mail-:-com]>: Relay 
access denied
** Near: insert smtp-port reduce [from reduce [addr] message]

Could it be some security issue?

It worked with previous internet provider... A week ago we changed 
it and now this happens...

Should I contact my provider to change some security settings or 
should I change something in the script?
Vladimir:
29-Oct-2007
How complicated is it to make a simple file transfer between two 
computers (one client and one server) in rebol?
What protocol should I use?
Any ideas?

Currently I'm sending e-mails from clients with file attachments 
because it's the simplest way of collecting data from clients. (So 
far they were doing it by hand - so this is a big improvement :)
Gregg:
29-Oct-2007
There are a lot of ways you could do it, FTP, LNS, AltMe file sharing, 
custom protocol on TCP. It shouldn't be hard, but I would try to 
use something existing. The devil is in the details.
Graham:
29-Oct-2007
I've done file transfer using async Rebol rpc, and also using Uniserve
Graham:
29-Oct-2007
I think Carl posted some code on how to do huge file transfers.
james_nak:
30-Oct-2007
Here is something from the rebservices section from Gabriele:

client has experimental generic http proxy support; server has the 
new, much improved file service. see http://www.rebol.net/rs/demos/file-client.r
for example usage to transfer big files.
Vladimir:
30-Oct-2007
Files are 1-2 MB. Zipped archives of dbf files.

As I said, now I'm using a small rebol script to send the file as 
an attachment; a human on the other side is downloading and unpacking 
them, and it's working.

I planned to make a "server" side script that would download newly 
arrived attachments and unpack them in designated folders, but then 
I thought about trying some real client-server approach...

Then again, the server would have to be started at the time of transfer. 
I have to know IP addresses and make them public (hamachi jumps 
in as help)...

E-mail used as a buffer for data is not bad... And it works... But 
I have to check max mailbox size... What if workers execute the sending 
script more than once?

There is one strange thing with sending big (>1 MB) files:

On Win98 it goes without any problem. On XP, at the end of the transfer 
rebol returns an error message about a network timeout, but the file 
is sent and all is OK.

Thanks guys... Lots of info... Will check it out and send my experiences.
Ingo:
30-Oct-2007
A script to send files over the network using TCP. It once started 
as two 3-liners:

http://www.rebol.org/cgi-bin/cgiwrap/rebol/view-script.r?script=remote-file.r
Gabriele:
30-Oct-2007
we're using rebol/services to transfer backups from www.rebol.net 
to mail.rebol.net. files are a couple hundred MB. see http://www.rebol.net/rs/demos/file-client.r
Vladimir:
28-Jan-2008
How can I convert the content of a dbf file to a readable html file 
on a webserver?

I thought to use rebol to make conversion, and then transfer html 
to server using ftp...
Can someone point me in the right direction ?
Gregg:
28-Jan-2008
One thing REBOL isn't particularly good at is structured file I/O, 
which is what you need for DBF. Perfect job for a dialect though. 
:-)
Oldes:
28-Jan-2008
what do you mean with "structured file I/O"?
BrianH:
28-Jan-2008
File I/O with binary file structures needs better conversion facilities 
than REBOL has, at least to do efficiently.
Vladimir:
29-Jan-2008
There is only one file that needs to be put online.....
It's 13 MB big....
Does anybody have advice on how to parse it in binary mode?
Oldes:
29-Jan-2008
Here is, for example, a script for parsing an AVI file using the %stream-io.r 
script: http://box.lebeda.ws/~hmm/rebol/avi_latest.r  But I use it 
to parse other formats as well.
Oldes:
29-Jan-2008
I don't have any DBF file, but looking at the spec, it should not 
be difficult to read it.
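For reference, a sketch of reading the fixed DBF header fields in REBOL (offsets per the published dBASE III header layout; %data.dbf is a hypothetical file name):

```rebol
dbf: read/binary %data.dbf
; multi-byte header fields are little-endian, so reverse before converting
num-records: to-integer reverse copy/part skip dbf 4 4   ; bytes 4-7
header-size: to-integer reverse copy/part skip dbf 8 2   ; bytes 8-9
record-size: to-integer reverse copy/part skip dbf 10 2  ; bytes 10-11
print ["records:" num-records "header bytes:" header-size "record bytes:" record-size]
```

With the header and record sizes known, the records themselves can be walked as fixed-width slices starting at the header-size offset.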
btiffin:
13-Feb-2008
Compressed at http://www.rebol.net/rs/server.r and client.r with 
the standard  save ... ctx-services thingy.

I'm pretty sure this is Gabriele's later release that has support 
for large file transfers.

I would say yes, LNS is still a good plan.  It's in use with DevBase.
Vladimir:
28-Mar-2008
I want to upload a file to ftp. 
I know I can do it like this:
write/binary ftp://user:[pass-:-website-:-com] read/binary %file

Or I'm supposed to be able to.... it just won't let me....


>>  write/binary ftp://user:[pass-:-ftp-:-site-:-com] read/binary %file.txt
** User Error: Cannot open a dir port in direct mode

** Near: write/binary ftp://user:[pass-:-ftp-:-site-:-com] read/binary %file.txt

I can read the contents of ftp rootdir with:
print read ftp://ftp.site.com/


But writing is not working.... What does "Cannot open a dir port 
in direct mode" mean?
BrianH:
28-Mar-2008
Try this:  write/binary ftp://user:[pass-:-ftp-:-site-:-com]/file.txt read/binary 
%file.txt
You were trying to write to a directory, not a file.
Vladimir:
29-Oct-2008
Ftp problem again.... everything was working perfectly from March 
till now.... we changed internet provider yesterday and my scripts 
don't work any more :(
Here is the core of the script where the error is:

server: ftp://user:[pass-:-ftp-:-firm-:-com]/data
file: %file.zip
from-port: open/binary/direct lokal/:file
to-port: open/binary/new/direct server/:file

** Access Error: Network timeout
** Where: confirm
** Near: to-port: open/binary/new/direct server/:file
Vladimir:
30-Oct-2008
Trace/net did give more info:

It's not my new internet provider... :)  it's our old router used in 
a new way... :)
This is the message I get...
Username... pass.... OK.... and then:
Net-log: "Opening listen port 2655"
Net-log: [["PORT" port/locals/active-check] "200"]
Net-log: "200 PORT command successful"
Net-log: [["CWD" port/path] ["25" "200"]]
Net-log: "250 OK. Current directory is /apl"
Net-log: "Type: new"
Net-log: ["TYPE I" "200"]
Net-log: "200 TYPE is now 8-bit binary"
Net-log: [["STOR" port/target] ["150" "125"]]
Net-log: "Closing cmd port 2652 21"
Net-log: "Closing listen port 2655"
** Access Error: Network timeout
** Where: confirm
** Near: to-port: open/binary/new/direct server/:file
Vladimir:
30-Oct-2008
user name is "visaprom.com"
Total Commander is set up simply...
"use passive" is ON...
Here is what happens with ftp/passive: true:

Net-log: "215 UNIX Type: L8"
Net-log: ["PWD" "25"]
Net-log: {257 "/" is your current location}
Net-log: ["PASV" "227"]
Net-log: "227 Entering Passive Mode (194,9,94,127,235,183)"
Net-log: [["CWD" port/path] ["25" "200"]]

this is where it pauses and then:

Net-log: "Closing cmd port 3565 21"
** Access Error: Network timeout
** Where: confirm
** Near: to-port: open/binary/new/direct server/:file
Vladimir:
30-Oct-2008
to-port: open/binary/new/direct server/:file

URL Parse: visaprom.com password ftp.visaprom.com none apl/ ik104.zip
Net-log: ["Opening" "tcp" "for" "FTP"]
Net-log: [none ["220" "230"]]

Net-log: {220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------}
Net-log: "220-You are user number 188 of 400 allowed."
Net-log: "220-Local time is now 11:33. Server port: 21."
Net-log: "220-This is a private system - No anonymous login"
Net-log: {220-IPv6 connections are also welcome on this server.}

Net-log: {220 You will be disconnected after 15 minutes of inactivity.}
Net-log: [["USER" port/user] "331"]
Net-log: "331 User visaprom.com OK. Password required"
Net-log: [["PASS" port/pass] "230"]
Net-log: {230-User visaprom.com has group access to:  www     }
Net-log: "230 OK. Current restricted directory is /"
Net-log: ["SYST" "*"]
Net-log: "215 UNIX Type: L8"
Net-log: ["PWD" "25"]
Net-log: {257 "/" is your current location}
Net-log: ["PASV" "227"]
Net-log: "227 Entering Passive Mode (194,9,94,127,216,138)"
Net-log: [["CWD" port/path] ["25" "200"]]
Vladimir:
31-Oct-2008
There you go:

http://www.2shared.com/file/4192455/b9f6ca7d/LB_Manual_1_1.html

Suggestions are welcome :)
Vladimir:
4-Nov-2008
here is command with error:
to-port: open/binary/new/direct server/:file
** Access Error: Network timeout
** Where: confirm
** Near: to-port: open/binary/new/direct server/:file
DideC:
4-Nov-2008
It seems the problem is after the PORT command.

It defines the port used to receive or send the file data (depending 
on the command you issue).

Use Wireshark to have a look at what Total Commander does regarding 
its PORT command, so we can compare with the Rebol commands.

I guess the router firewall blocks the one Rebol uses, but Total 
Commander does it in another way.
Vladimir:
5-Nov-2008
No.    Time       Source         Destination    Protocol  Info
90     2.750586   192.168.2.108  194.9.94.127   FTP  Request: TYPE I
97     2.823074   194.9.94.127   192.168.2.108  FTP  Response: 200 TYPE is now 8-bit binary
98     2.828500   192.168.2.108  194.9.94.127   FTP  Request: PASV
113    3.171841   192.168.2.108  194.9.94.127   FTP  [TCP Retransmission] Request: PASV
114    3.244193   194.9.94.127   192.168.2.108  TCP  [TCP Previous segment lost] ftp > mgemanagement [ACK] Seq=80 Ack=15 Win=16500 Len=0
131    3.889034   194.9.94.127   192.168.2.108  FTP  [TCP Retransmission] Response: 227 Entering Passive Mode (194,9,94,127,250,69)
137    3.984887   192.168.2.108  194.9.94.127   FTP  Request: STOR ik104test.zip
149    4.247163   194.9.94.127   192.168.2.108  TCP  ftp > mgemanagement [ACK] Seq=80 Ack=35 Win=16500 Len=0
210    7.046287   194.9.94.127   192.168.2.108  FTP  Response: 150 Accepted data connection
241    7.218716   192.168.2.108  194.9.94.127   TCP  mgemanagement > ftp [ACK] Seq=35 Ack=110 Win=16269 Len=0
1613   17.145048  194.9.94.127   192.168.2.108  FTP  Response: 226-File successfully transferred
1617   17.172970  192.168.2.108  194.9.94.127   FTP  Request: SIZE ik104test.zip
1620   17.277591  194.9.94.127   192.168.2.108  FTP  Response: 213 566605
1623   17.375906  192.168.2.108  194.9.94.127   FTP  Request: TYPE A
1628   17.498619  194.9.94.127   192.168.2.108  FTP  Response: 200 TYPE is now ASCII
1629   17.516657  192.168.2.108  194.9.94.127   FTP  Request: PASV
1633   17.644044  194.9.94.127   192.168.2.108  FTP  Response: 227 Entering Passive Mode (194,9,94,127,205,237)
1637   17.750889  192.168.2.108  194.9.94.127   FTP  Request: LIST
1643   17.835367  194.9.94.127   192.168.2.108  FTP  Response: 150 Accepted data connection
1644   17.863490  194.9.94.127   192.168.2.108  FTP  Response: 226-Options: -a -l
1645   17.863548  192.168.2.108  194.9.94.127   TCP  mgemanagement > ftp [ACK] Seq=75 Ack=364 Win=16015 Len=0
Vladimir:
5-Nov-2008
I uploaded the manual for my router before:
http://www.2shared.com/file/4192455/b9f6ca7d/LB_Manual_1_1.html

Can someone tell me where these settings for "connection tracking" 
are?
Group: rebcode ... Rebcode discussion [web-public]
Pekr:
24-Oct-2005
guys, how well does parse play with rebcode? It was said that parse 
is a VM in itself, which is very true, but now we have some discussion 
about zlib support. Let's suppose we have a rebol version on rebol.org 
and that we would like to speed it up. We can simply extend the idea 
to any other datatype (= in Amiga terms, simply a file format, or 
a protocol). You will surely want to use parse. The question is 
whether you can speed up some things using rebcode.
Group: Tech News ... Interesting technology [web-public]
Terry:
14-May-2006
Jaime, just had a look at 'migrations' and it's not the same at all.. 
 here's the pseudo code just to change the db with rails..

    * Step 1: Create a migration with script/generate migration WhatImChanging
    * Step 2: Modify your generated migration file in db/migrate
    * Step 3: Run rake migrate

    * Step 4: Revel in the fact that your database is converted to the 
    newest schema!


With Framewerks you never alter the DB.. it's a black box where data 
goes in and out.
Terry:
14-May-2006
From the "Using Ruby on Rails for Web Development" article

<quote>

If you try to submit the form, Rails complains that it can't find 
the record action to handle the form post. We need to define that 
action in the ExpensesController. Add the following action method 
to the app/controllers/expenses_controller.rb file:

def record
  Account.find(params[:id]).expenses.create(params[:expense])
  redirect_to :action => 'show', :id => params[:id]
end

</quote>
Group: SQLite ... C library embeddable DB [web-public].
Pekr:
1-Aug-2006
I have one suggestion. Trying to use sqlite for cgi, I have the 
following dir structure:

\app
app.cgi

\app\system (sqlite.r, sqlite.dll, other app related "system" files)
\app\data (*.db)


I don't like the sqlite driver putting its .log file into the caller 
directory = the main app directory. Not sure where it belongs, whether 
in \system, \data, or simply a \log subdir, but the driver has no 
ability to set the path ...
Pekr:
1-Aug-2006
I did the following modifications to the driver:

log-path: to-file copy ""

then replace/all "%sqlite.log" "join log-path %sqlite.log"

Then in my cgi script I am able to do sqlite/log-path: %db/ to change 
the location ...... maybe it would be useful to even set the db path 
and not bother with paths, not sure ....
Pekr:
1-Aug-2006
I don't understand the line: unless find first database %/ [insert 
first database what-dir], as it just changes the path of the first 
file. Is that OK?
Ashley:
1-Aug-2006
sqlite open command expects a fully qualified local file name ... 
the line in question prepends the supplied file name with current 
dir unless the file name is in fact a path.
Ashley:
1-Aug-2006
The log-path issue is best resolved by adding a log-file word to 
the sqlite context that defaults to %sqlite.log. You can then do 
the following in your code:

	sqlite/log-file: %my-path/my-log-file.log
Ashley:
2-Aug-2006
From the User Guide: "Every connect, disconnect, error and statement 
retry is logged to %sqlite.log. This refinement adds SQL statements 
as well. While this can be useful to monitor what SQL statements 
are being issued and what the volume and distribution is; be sure 
to monitor the size of this file in high transaction environments."


If you really don't want any log output then just direct it to /dev/null
Pekr:
18-Sep-2006
Hi, I know that some talk of encryption was held here some time ago, 
but currently I was asked to possibly protect sqlite data and I am 
not sure what the correct approach is. I would not go for DB-as-a-file 
encryption, then "unpacking" into memory, or so. I prefer app-level 
encryption, but I am not sure about searches using the LIKE directive. 
Would it work?
Robert:
6-Nov-2006
I have a problem when I import a CSV file. I read the file (1.5-2 
MB), parse it, and then write it out to SQLite.

For some records I get scrambled (invalid) strings in the database. 
Importing the CSV file via other SQLite tools works without any problem.

It looks like the Rebol memory somehow gets messed up. Or could it 
be on the way to the DLL?
Pekr:
7-Nov-2006
there is one thing I really don't like about sqlite - it stores everything 
in one file. I want one file per table, one file per index, as 
with mysql, because for me it means simplicity - I can just look 
into the file system and see how big some table is, or selectively 
back up some tables .... mySQL works that way IIRC
Louis:
9-Nov-2006
Thanks, Ashley.  But that website has a flaw; it can't be downloaded 
easily. It should be either one html page or else one pdf file.
Henrik:
9-Nov-2006
wget is also very good at resuming downloads:

wget -c <very incredibly big file>
Robert:
26-Nov-2006
I have created a semicolon-separated file and imported it via the 
SQLite command line tool. All numbers were included just plain, 
not guarded by " chars.
Pekr:
14-Dec-2006
I am trying to analyse a few sendmail logs. Our admin sent me three 
files. The first one has those small boxes instead of newlines - 
you know it, when you try to open a Linux file under Windows.
Pekr:
14-Dec-2006
If I import one file at a time and clear the block, then the data is OK 
in sqlite, but if I first append everything to one block and then insert 
into sql, the data is corrupted in a few random places ...
Pekr:
14-Dec-2006
following works:

;--- import log files
import-logs: does [
    ;--- pairs of incident no. and incident file [[1 filename] [2 filename] etc.] ...
    log-files: [
        [1 06-12-06-7_50-7_59]
        [2 06-12-12-15_46-15_47]
        [3 06-12-13-15_29-15_31]
    ]
    foreach file log-files [
        log-info: copy []
        log-file: read/lines file/2
        ;print length? log-file
        foreach string-line log-file [
            line: parse string-line " "
            if line/7 == "GET" [
                append log-info reduce [
                    line/1 line/2 line/4 line/8 line/11 to-string file/1
                ]
            ]
        ]
        SQL "BEGIN"
        foreach [date time ip-address url content-type incident-id] log-info [
            SQL reduce [
                "INSERT INTO logs VALUES (?, ?, ?, ?, ?, ?)"
                date time ip-address url content-type incident-id
            ]
        ]
        SQL "COMMIT"
    ]
]
Pekr:
14-Dec-2006
If I put the SQL section outside the foreach file log-files, simply appending 
all logs at once, the data is corrupted ... it is reproducible ....
Pekr:
14-Dec-2006
converting the first file (reading and saving it) did not help either 
... my suspicion is that there is some bug in the driver ...
Pekr:
29-Nov-2007
... and they definitely should completely remove their claim that 
1 file for the db is an advantage. That is the most serious obstacle 
of sqlite ... simplicity comes via the ability to easily back up ... one 
file per index, per table ...
Ingo:
30-Nov-2007
Well, simplicity lies in the eyes of the beholder ... just having 
to back up a single file seems pretty easy to me ...
Pekr:
13-Dec-2007
RebDB is also mostly an in-memory-only database. It does not live 
on the HD. So SQLite has one advantage here - it supports locking over 
a file-shared SQLite database.
Ashley:
19-Dec-2007
Why not? It's ACID compliant and SQLite on a server where all file 
ops are local to the DB process seems OK to me.
Louis:
8-Sep-2008
rebview cl-sqlite.r


Any idea why I'm getting this (using Ubuntu and Ashley's sqlite.r): 
Script Error: Library error: libsqlite3.so: cannot open shared object 
file: No such file or directory
** Near: *lib: load/library switch/default fourth system/version
>>
sqlab:
19-Sep-2008
What is the recommended way to deal with a lock and to get rid of 
the dat-journal?

I do not like stopping all processes and manually removing the file.
Also, increasing the retries has its limitations.
So, how do you handle the problem?
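For comparison, a sketch of the usual answer in Python's stdlib sqlite3 (not the REBOL driver): let SQLite wait out a busy lock via a timeout instead of deleting the journal by hand. A hot journal is rolled back automatically by the next connection that opens the file; removing it manually risks corrupting the database. The file path here is made up for the example.

```python
import sqlite3, tempfile, os

path = os.path.join(tempfile.mkdtemp(), "shared.db")

# timeout= makes the library retry a busy lock for up to 5 seconds
con = sqlite3.connect(path, timeout=5.0)
# the same knob is also settable per connection inside SQL
con.execute("PRAGMA busy_timeout = 5000")
con.execute("CREATE TABLE IF NOT EXISTS t (x)")
con.commit()
con.close()
```

In the REBOL driver the equivalent would presumably be raising its retry setting rather than touching the journal file.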
Maarten:
15-Oct-2008
I think you should try parse on a large file with /seek, just to 
test. Or load it into memory upfront, so you have the cost once.
Robert:
3-Dec-2008
Ashley, I just remembered that you can't call CONNECT/CREATE several 
times in one application. It gives the error "Database already connected" 
even if you use different file names.


To open more than one database file you have to use the sql ATTACH 
command starting from the 2nd database file.
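Robert's ATTACH workaround, sketched with Python's stdlib sqlite3 for clarity (the file names are invented for the example): one connection stays open, and every further database file is brought in through the SQL ATTACH command under an alias.

```python
import sqlite3, tempfile, os

d = tempfile.mkdtemp()
main_db = os.path.join(d, "main.db")
second_db = os.path.join(d, "second.db")

con = sqlite3.connect(main_db)                    # first (and only) connect
con.execute("ATTACH DATABASE ? AS extra", (second_db,))  # 2nd file via SQL

# tables in the attached file are addressed through the alias
con.execute("CREATE TABLE extra.log (msg TEXT)")
con.execute("INSERT INTO extra.log VALUES ('hello')")
con.commit()
```

The attached file is a normal, separate database on disk; only the access path goes through the single connection.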
Robert:
3-Dec-2008
Ok, some more findings. I think the best way is to make a copy of 
the SQLite object for each database file. Then things are independent. 
The only thing to solve is finding an elegant way to select which 
SQLite object/connection to use without having to prefix all calls.
Pekr:
4-Dec-2008
well, but at some point you open up that partition in order to be 
able to access it. The security is not there anymore. What I would 
like to have is direct SQLite low-level encryption, so that the file 
might be visible to the FS but still encrypted, and your app provides 
a password or something like that ... IIRC BrianH is using some such 
solution, I just don't remember its name.
Ashley:
4-Dec-2008
Robert, I was thinking we can deprecate the /create refinement by 
making that implicit as well ... and the change required to support 
additional CONNECTs after the first should be as simple as changing 
the line that reads:

	all [dbid sql-error "Already connected"]

to something like:

	if all [dbid file? database]  [
		unless find file %/ [insert file what-dir]
		sql rejoin ["attach '" ...
		return
	]


which then raises the interesting question as to whether we should 
force database to be file! (so you'd have to attach multiple databases 
by issuing multiple CONNECTs ... it would certainly simplify the CONNECT 
logic! ;)
Robert:
4-Dec-2008
And then the returned DB handle has to be used for all actions against 
this database file. It's much like a file handle.
Robert:
4-Jan-2009
FYI: I'm currently adding some stuff to Ashley's SQLite driver to:

1st: Handle in memory databases (":memory:")
2nd: To handle connection to more than one database file at once.

So, if someone has done this already please let me know :-)
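The ":memory:" special name Robert mentions is standard SQLite behaviour; a sketch with Python's stdlib sqlite3 (not the REBOL driver) shows its key property: every connection to ":memory:" gets its own fresh, private database that vanishes when the connection closes.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (x INTEGER)")
con.execute("INSERT INTO t VALUES (42)")

# a second connect to ":memory:" is a completely independent database
other = sqlite3.connect(":memory:")
tables = other.execute(
    "SELECT name FROM sqlite_master WHERE type='table'").fetchall()
# 'other' sees no table t: the two in-memory databases are separate
```

This is why a driver needs per-connection state to support ":memory:" sensibly: two connects to the same name do not share data.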