r3wp [groups: 83 posts: 189283]
 

World: r3wp

[!REBOL3-OLD1]

Maxim
14-Dec-2009
[20184x2]
My only problem with R3 right now is that there is no codec for text 
reading. This means I can't properly import C files, for example, 
unless I convert them to UTF-8 with something else first.


Has anyone done (or started to work on) a simple character mapping 
engine?
The new bitset! datatype is absolutely fantastic!  We can actually 
USE it now  ;-)

Before, it was basically just a parse accelerator.
Pekr
15-Dec-2009
[20186x2]
Guys, to make things clear for me about the recent efforts: if someone 
reports that R3 compiles on system XY, does it mean that we are able 
to build R3 on such a platform, i.e. it is built and usable there? 
Or are we reporting that Carl is able to compile just a DLL on the 
target platform?
Because if it is the former, it means that in 3 weeks we have ported 
the actual REBOL version (even if pre-beta) to more platforms than 
R2 got to in the last 10 years :-)
Henrik
15-Dec-2009
[20188]
I think we need a simple document or FAQ that answers what it takes 
to port R3 to a completely new CPU/OS platform. Just 10-15 bulletpoints.
Pekr
15-Dec-2009
[20189]
There will of course be a difference in porting Core vs., e.g., 
View ...
Maxim
15-Dec-2009
[20190]
Pekr, once Carl can build a dynamic library on a platform, someone 
can do the rest using the common host code.  As most alternative 
OSes support some level of Unix C libraries, adapting the 
host code should not be a big issue.  Since the code is based on 
ANSI C90, just about every platform out there supports that.


And yes, on every platform where Carl has compiled the host in the 
last weeks, R3 runs and is usable.
btiffin
15-Dec-2009
[20191]
HOST KIT FOR LINUX!   And I'm three days behind already.  Why didn't 
anyone send a memo to read the blog?   :)


Woohoo, back to REBOL programming...     And just so ya know, when 
I embed this in OC it'll be CORE BOL  ;)
Maxim
15-Dec-2009
[20192]
:-D
Maxim
16-Dec-2009
[20193]
what's the best way to convert a hex string to a decimal value in 
R3?
Sunanda
16-Dec-2009
[20194]
One way is to start with an issue! rather than hex
    >> to-integer #100
    == 256
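Sunanda's trick works because the digits of the issue! token are read as base-16. The same conversion expressed in Python, as a cross-check of the idea rather than REBOL code:

```python
# The digits "100" read as base-16, like to-integer #100 in R3
value = int("100", 16)
assert value == 256
```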
Maxim
16-Dec-2009
[20195]
perfect... just what I needed  thanks  :-)
PeterWood
16-Dec-2009
[20196]
I got some strange results with decimal!
Sunanda
16-Dec-2009
[20197]
But check it works for large values first.....
PeterWood
16-Dec-2009
[20198]
>> to integer! #{2710}
== 10000

>> to decimal! #{2710}
== 4.94065645841247e-320
Sunanda
16-Dec-2009
[20199]
decimal is broken with issue.
PeterWood
16-Dec-2009
[20200]
>> to decimal! to binary! 10000
== 4.94065645841247e-320

>> to integer! to binary! 10000
== 10000
Sunanda
16-Dec-2009
[20201]
Probably explained by this Curecode report
   http://www.curecode.org/rebol3/ticket.rsp?id=547
PeterWood
16-Dec-2009
[20202]
There is something that doesn't seem correct about the following:

>> to decimal! to integer! to binary! 10000
== 10000.0

>> to decimal! to binary! 10000
== 4.94065645841247e-320
Steeve
16-Dec-2009
[20203x2]
Not a bug, as explained in the CureCode ticket:
decimals and integers are not encoded the same way in a binary
Micha
16-Dec-2009
[20205]
00
Ladislav
16-Dec-2009
[20206]
Peter: 

>> to binary! 10000.0
== #{40C3880000000000}

>> to binary! 10000
== #{0000000000002710}

Why do you think they should be the same?
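Ladislav's point is that to binary! serializes each value's native bit pattern: a 64-bit two's-complement integer for 10000, an IEEE 754 double for 10000.0. A Python sketch with the struct module (a stand-in to illustrate the encodings, not REBOL behaviour itself) also explains the strange 4.94e-320 result:

```python
import struct

# 10000 as a big-endian 64-bit integer, like to binary! 10000 in R3
int_bytes = (10000).to_bytes(8, "big")
assert int_bytes.hex().upper() == "0000000000002710"

# 10000.0 as a big-endian IEEE 754 double, like to binary! 10000.0
dbl_bytes = struct.pack(">d", 10000.0)
assert dbl_bytes.hex().upper() == "40C3880000000000"

# Reinterpreting the integer bytes *as if* they were a double lands in
# the subnormal range: mantissa 0x2710 (= 10000) scaled by 2**-1074.
reinterpreted = struct.unpack(">d", int_bytes)[0]
assert reinterpreted == 10000 * 2**-1074  # ~4.94065645841247e-320
```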
PeterWood
16-Dec-2009
[20207x2]
I don't. It's the behaviour of to-decimal that I don't understand:

>> to decimal! #{0000000000002710}
== 4.94065645841247e-320
In R2 this would raise a script error:

>> to decimal! to binary! 10000   
** Script Error: Invalid argument: #{3130303030}
** Near: to decimal! to binary! 10000
Geomol
16-Dec-2009
[20209]
In R2:
>> to binary! 10000
== #{3130303030}

So we get the ASCII value of each digit in the number. In R3:

>> to binary! 10000
== #{0000000000002710}

The number is seen as a 64-bit integer, and we get the binary representation 
of that.
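The R2 vs. R3 difference Geomol describes can be checked in Python terms (a cross-check of the two encodings, not REBOL code):

```python
# R2: to binary! 10000 yields the ASCII bytes of the digit characters
assert b"10000".hex().upper() == "3130303030"

# R3: to binary! 10000 yields the big-endian 64-bit integer bytes
assert (10000).to_bytes(8, "big").hex().upper() == "0000000000002710"
```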
Ladislav
16-Dec-2009
[20210x2]
Then why do you think that this is a bug?

>> to decimal! to binary! 10000.0
== 10000.0
If you want to know more, read http://www.rebol.net/wiki/Decimals-64
BrianH
16-Dec-2009
[20212]
Maxim: "what's the best way to convert a hex string to a decimal 
value in R3?" - Try this:
>> pi
== 3.14159265358979
>> enbase/base to-binary pi 16
== "400921FB54442D18"
>> to-decimal debase/base "400921FB54442D18" 16
== 3.14159265358979
>> to-decimal debase/base enbase/base to-binary pi 16 16
== 3.14159265358979


You asked for the best way: No method that uses the issue! type for 
binary conversions could be considered the best way.
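BrianH's enbase/debase round-trip can be mirrored in Python with struct, which packs the double's IEEE 754 bits and renders them as hex (a Python stand-in for the same bit pattern, not REBOL code):

```python
import math
import struct

# Mirror of: enbase/base to-binary pi 16
hex_str = struct.pack(">d", math.pi).hex().upper()
assert hex_str == "400921FB54442D18"

# Mirror of: to-decimal debase/base "400921FB54442D18" 16
restored = struct.unpack(">d", bytes.fromhex(hex_str))[0]
assert restored == math.pi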
Sunanda
16-Dec-2009
[20213]
Not sure what specific data transformation Maxim needs.

I'd be interested in the REBOL code to do what would seem to be 
a simple transformation. How do I get from:

    #{0100}   ;; two bytes of binary, perhaps read from an external file

to:

    256       ;; the equivalent integer (given the implicit endian 
    assumptions)
Steeve
16-Dec-2009
[20214x2]
If it's for storing, then restoring, perhaps the base-64 format 
is enough:

>> enbase to-binary pi
== "QAkh+1RELRg="

>> to-decimal debase enbase to-binary pi
== 3.14159265358979
And it takes less space (besides, it always has the same length).
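Steeve's base-64 round-trip, mirrored in Python (a cross-check of the encoding, not REBOL code); 8 bytes of double always encode to 12 base-64 characters:

```python
import base64
import math
import struct

# Mirror of: enbase to-binary pi
encoded = base64.b64encode(struct.pack(">d", math.pi)).decode()
assert encoded == "QAkh+1RELRg="

# Mirror of: to-decimal debase enbase to-binary pi
restored = struct.unpack(">d", base64.b64decode(encoded))[0]
assert restored == math.pi
```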
BrianH
16-Dec-2009
[20216x2]
Ah, it's those implicit endian assumptions that can trip you up, 
Sunanda. But here's a good method that works for big-endian 64-bit on R3:
>> to-integer #{0100}
== 256
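The same big-endian interpretation of the two bytes, in Python terms (a cross-check, not REBOL code):

```python
# Big-endian reading of bytes 01 00, like to-integer #{0100} in R3
value = int.from_bytes(bytes.fromhex("0100"), "big")
assert value == 256
```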
Sunanda
16-Dec-2009
[20218]
Thanks, Brian......I'd tried that, and it did not work for me. Must 
have been having one of those blond moments, because it works now 
:)
Steeve
16-Dec-2009
[20219]
newb instant ;-)
BrianH
16-Dec-2009
[20220]
It should work for 32-bit integers too, but there isn't yet a build 
that supports them (though the source does).
Steeve
16-Dec-2009
[20221]
You mean something like this?
>> to-integer #{01000000}
== 16777216
BrianH
16-Dec-2009
[20222x3]
I mean the 32-bit integer! type, not a 32-bit binary encoding converting 
to the current 64-bit integer! type.
It's supposed to be an R3 build option for embedded or legacy systems 
that don't support 64-bit integer math.
Steeve
16-Dec-2009
[20225x4]
And my conversion is not exact, because we don't have the sign-bit 
extension.
Nice puzzle:

Given a signed 16-bit binary integer, produce the corresponding R3 integer.
A one-liner solution (without any test) would be appreciated.
I don't like it very much, but I think I got something...


>> signed16: funco [n][(complement -1 + shift n -15) and -65536 or n]
>> signed16 to-integer #{FFFF}
== -1
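Steeve's signed16 builds the sign extension by testing bit 15 and OR-ing in the high bits. A Python transcription of the same logic (a sketch of the technique using unbounded Python integers, not the REBOL code itself):

```python
def signed16(n):
    # Transcription of: (complement -1 + shift n -15) and -65536 or n
    # If bit 15 of n is set, (n >> 15) is 1, so ~(-1 + 1) = -1, which
    # ANDed with -65536 supplies the high mask ...FFFF0000;
    # otherwise the mask term is 0 and n passes through unchanged.
    return (~(-1 + (n >> 15)) & -65536) | n

assert signed16(0xFFFF) == -1
assert signed16(0x8000) == -32768
assert signed16(0x7FFF) == 32767
```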
BrianH
16-Dec-2009
[20229]
Well, that math solution is better than the binary manipulations 
I was doing.
Steeve
16-Dec-2009
[20230]
Idea coming from a Forth kernel I've done.
Maxim
16-Dec-2009
[20231]
I like the discussion occurring because of my simple question... all 
of your answers are pre-emptively solving other issues I was going 
to have in the future.  :-)
Steeve
16-Dec-2009
[20232x2]
Argh... I felt vaguely that there was an easier way:

>> signed16: funco [n][32767 - n xor 32767]
>> signed16 to-integer #{FFFF}
== -1
32767 is found like this for signed 16-bit integers:
2 ** 15 - 1
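The shorter formula works because for n below 2**15 the subtraction and the XOR cancel out, while for n at or above 2**15 the subtraction goes negative and the XOR yields n - 65536. A Python transcription (a sketch of the trick, not the REBOL code itself):

```python
def signed16(n):
    # Transcription of Steeve's one-liner: 32767 - n xor 32767
    # (REBOL evaluates left to right, so this is (32767 - n) xor 32767;
    # 32767 is 2**15 - 1, the low-15-bit mask)
    return (32767 - n) ^ 32767

assert signed16(0xFFFF) == -1
assert signed16(0x0001) == 1
assert signed16(0x8000) == -32768
```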