Hi,
I guess it's sorting a big bunch of data that's the problem here. So just avoid sorting a big bunch of data ;-)
This looked interesting, so I played with it a bit. See my results:
To avoid having to sort huge amounts of data, I first create an array with the line numbers as keys and store two random values under each key. Then I sort the list of keys twice, once by each value (LiveCode's sort is stable, so the first pass only decides ties in the second). This gives a list containing each possible line number, in random order:
Code: Select all
function getLiNus theNum
   /* Creates a list containing all numbers from 1 to "theNum",
   in randomized order, each on its line   */
   
   put 1 into myCnt
   repeat theNum
      put random(99999) into myArr[myCnt][1]          --  create  sort array
      put random(99999) into myArr[myCnt][2]
      add 1 to myCnt
   end repeat
   
   get the keys of myArr
   sort lines of it numeric by myArr[each][1]       --  sort it
   sort lines of it numeric by myArr[each][2]
   
   return it
end getLiNus
This is sufficiently fast, and sufficiently randomized (you might play with the number of random values per key, and the random(x) range). And this is how you utilize it:
Code: Select all
on mouseUp
   answer file "Which file?"
   if it is empty then exit mouseUp
   
   put URL ("file:" & it) into myData
   put 1 into myCnt
   repeat for each line L in myData                      --  create data array
      put L into myDatArr[myCnt]
      add 1 to myCnt
   end repeat
   
   put getLiNus(the number of lines of myData) into mySort    --  fetch the sort list   
   repeat for each line L in mySort                           --  and apply it
       put myDatArr[L] & CR after myRes
   end repeat
   delete char -1 of myRes
   put myRes
end mouseUp
Hint: For easy reading this code is stripped of everything not strictly necessary. For instance, each repeat loop should start with:
Code: Select all
      wait 0 millisec with messages
      if the controlKey is down then exit repeat
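So, for example, the first repeat loop of getLiNus would then read (same variable names as above):
Code: Select all

```
   repeat theNum
      wait 0 millisec with messages                 --  let pending messages through
      if the controlKey is down then exit repeat    --  emergency exit
      put random(99999) into myArr[myCnt][1]
      put random(99999) into myArr[myCnt][2]
      add 1 to myCnt
   end repeat
```

This costs a little speed, but keeps the app responsive and lets you abort a runaway loop.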
A problem I ran into is that fetching single lines of our text data by chunk expression becomes very slow - once it's a really fat chunk with thousands of lines, every "line x of myData" has to scan the text from the start. So I start by throwing all lines of our data into an array first, with the line number as key. This way I can access them much faster later, once I have my sort order :)
For a file with ~55 MB & ~1,000,000 lines I get (milliseconds):
Code: Select all
Make data array: 4757 | Rearrange lines of data: 5570
Make sort array: 6295 | Sort sort array: 3924
Over all: 20867
Lines: 1000436
Remark: Even with "wait 0 with messages" this still runs into "unresponsiveness" during the last loop (rearranging the lines).
Anyway, perhaps something here is useful for someone. Have fun!