Hello all,
I have been told I need to process a list of 31,000 integers stored in memory on my computer, one integer per word; a word is 2 bytes. I estimate that 20% of the instructions will be unary. I am trying to figure out how to calculate the maximum number of instructions I can use in the program that processes this data. Note that the program shares memory with the OS and the data.
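To show what I mean, here is a rough sketch of the calculation I'm imagining, written in Python. The total memory size, the OS footprint, and the instruction lengths (1 word for a unary instruction, 2 words for a binary one) are all assumptions on my part since I haven't pinned them down yet; only the data size and the 20% figure come from my problem:

```python
# All "assumed" values below are placeholders, not given in my problem.
TOTAL_WORDS = 64 * 1024      # assumed: 64K-word memory
OS_WORDS = 2 * 1024          # assumed: OS footprint in words
DATA_WORDS = 31_000          # given: one integer per word

UNARY_FRACTION = 0.20        # given: my 20% estimate
UNARY_LEN = 1                # assumed: unary instruction = 1 word
BINARY_LEN = 2               # assumed: binary instruction = 2 words

# Average instruction length in words, weighted by the unary/binary mix.
avg_len = UNARY_FRACTION * UNARY_LEN + (1 - UNARY_FRACTION) * BINARY_LEN

# Words left over for the program after the OS and the data are placed.
program_words = TOTAL_WORDS - OS_WORDS - DATA_WORDS

# Maximum whole instructions that fit in the remaining words.
max_instructions = int(program_words // avg_len)
print(avg_len, program_words, max_instructions)
```

Is this the right general approach, i.e. subtract the OS and data from total memory, then divide by the average instruction length? And how should I determine the values I've had to assume?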