Hi,
The binaryEncode and binaryDecode functions are among the most confusing in LiveCode.
You can type that again!
I have found a tutorial about reading binary files at
http://lessons.runrev.com/spaces/lesson ... nary-File- by Trevor DeVore. My code is now working, but I still don't quite understand why.

Based on the dictionary description of binaryDecode, my code now looks like this:
Code: Select all
on LoadFile
   put "" into fld "debug"
   -- prompt for the input file (note: the input file is a binary file with no line endings)
   answer file "Select the binary file" with type "mux"
   if the result is "Cancel" then exit LoadFile
   put "binfile:" & it into tURL -- prefix the selected file path with binfile: so the next line reads it as raw binary
   put URL tURL into theBinaryData -- store the contents of the file in the variable theBinaryData
   put "" into tHex -- the conversion routine needs a variable
   -- put binaryDecode("h*", theBinaryData, tHex) into fld "debug" -- does not work
   ## assume that the binary data is stored in the variable theBinaryData
   put 1 into tCount -- I will only process the first 16 bytes of data (at present)
   -- parse the first 16 bytes and display them as a list and also on a single line in a field
   repeat until tCount = 17
      put byte tCount of theBinaryData into theByte -- where is byte defined? a dictionary search does not find it
      put theByte after theText -- add the byte (a single character) to a string
      put binaryDecode("H*", theByte, tHex) into theNumConversions -- from the tutorial; tHex is the value needed, but what is theNumConversions for?
      put tHex & " " after theHex -- store the hex digits for display on a single line later
      put tHex after fld "debug" -- display the hex value in the field
      put "- " & theByte & "-- " & theNumConversions & CR after fld "debug" -- display the character as well
      put tCount + 1 into tCount -- next byte
   end repeat
   -- simulated hex-editor style display on a single line
   put theHex & " == " & theText after fld "debug"
end LoadFile
The line "put binaryDecode("H*", theByte, tHex) into theNumConversions" is taken from the tutorial. theNumConversions ends up set to 1. Why?
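From a small test of my own (so treat this as my guess, not gospel), the return value seems to count the format items that were converted, while the decoded values land in the variables passed after the data. Variable names here are mine:
Code: Select all
-- small test: two format items, so two result variables
local tHex, tByteVal, tCount
put binaryDecode("H2c1", "@A", tHex, tByteVal) into tCount
-- tCount appears to be 2: one per format item converted
-- tHex appears to get "40" (hex for "@") and tByteVal 64+1 = 65 (the code for "A")
So with a single "H*" item in the formatList, getting 1 back would just mean "one format item converted", if my reading is right.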
The dictionary describes the first parameter of binaryDecode as a formatList; the code above uses H, which is defined as: "H: convert next amount bytes of data to hexadecimal digits, starting at the high end of each byte".
The first character in my test file is "@", which my hex editor and the web both decode as hex 40 (decimal 64). I interpret the dictionary definition as requiring me to enter the number of bytes to be converted, which in this case is 1. But if I enter H1, tHex is set to "4" rather than "40"; if I enter H2, tHex is correctly set to "40".

The dictionary also includes an important note, which boils down to an example of using h3 as the formatList. It states that h3 requires three variables to be defined, as each byte needs its own variable. That seems contrary to what I am seeing, because H2 or h2 causes a single byte to be decoded into a two-character hex number. The rest of the dictionary entry seems designed to confuse, e.g.:
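To show exactly what I mean (these are my own observations, my own variable names):
Code: Select all
-- the amount after H seems to count hex DIGITS, not bytes
put binaryDecode("H1", "@", tHex) into tDummy -- tHex becomes "4" (high nibble only)
put binaryDecode("H2", "@", tHex) into tDummy -- tHex becomes "40" (both nibbles of the one byte)
So in practice the amount looks like a digit count rather than the byte count the definition led me to expect.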
The amount corresponding to each dataType is an integer or the * character.
and then:
If you specify an amount with a string dataType (a or A), ...
But you just said it's an integer...? Any comments, as my head hurts?
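For contrast, the a/A string format at least behaves the way I would expect from that sentence: the integer amount seems to be a byte count read into one variable, and * apparently means "all remaining bytes" (again my own quick test, my own names):
Code: Select all
put binaryDecode("a4", "abcdefgh", tStr) into tDummy -- tStr becomes "abcd"
put binaryDecode("a*", "abcdefgh", tStr) into tDummy -- tStr becomes "abcdefgh"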
Simon