Potential problem: inability to load bitmaps above 1 MB #1790
elkaamee326
started this conversation in
Ideas
-
I believe we have tests and can read files larger than one cluster fine. For example, reading/writing 32,000 characters works without problems. Therefore I think the problem we are facing at the 1 MB boundary is a different one.
-
The number of bytes per cluster is 4096. When we load bitmaps larger than 1 MB, it tries to read all of the bytes in the file and uses up that allocated amount. Since no more bytes can be allocated past the 4096-byte cluster boundary, it just shuts down once it goes above the hardcoded amount. This was discovered and located in this function:
public uint[] GetFatChain(uint aFirstEntry, long aDataSize = 0)
I tried loading a bitmap that is less than 1 MB and the
xEntryOffset
variable never even reached 4096. To prove that the problem occurs here, I put a series of Global.mDebugger.Send("");
statements around most / all of the defined variables and inside some auxiliary functions too. When it is done reading all the bytes, it exits the while loop: while (!FatEntryIsEof(xValue)) { ... }
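The loop described above can be sketched generically. This is not Cosmos code, just a minimal Python model of how a FAT32 cluster chain is walked until the end-of-chain marker (the table contents and cluster numbers here are hypothetical):

```python
# Minimal sketch (not the Cosmos implementation): walking a FAT32
# cluster chain until the end-of-chain marker, analogous to the
# while (!FatEntryIsEof(xValue)) loop in GetFatChain.

FAT32_EOC = 0x0FFFFFF8  # FAT32 entries >= this value mark end of chain


def fat_entry_is_eof(value):
    # Only the low 28 bits of a FAT32 entry are significant.
    return (value & 0x0FFFFFFF) >= FAT32_EOC


def get_fat_chain(fat, first_entry):
    """Return the list of cluster numbers in the chain starting at first_entry."""
    chain = [first_entry]
    value = fat[first_entry]
    while not fat_entry_is_eof(value):
        chain.append(value)
        value = fat[value]
    return chain


# Hypothetical FAT where cluster 2 -> 3 -> 5 -> end-of-chain:
fat = {2: 3, 3: 5, 5: 0x0FFFFFFF}
print(get_fat_chain(fat, 2))  # [2, 3, 5]
```

The point of the sketch is that the chain itself has no 4096-byte limit; the limit being hit must come from how the data of those clusters is buffered afterwards.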
Would a solution be to allow larger files to access more than one cluster to read from? Or could we just clear the data in a cluster when it's full, similar to something like this:
if (xEntryOffset == 4096) { ClearClusterData(Params param); }
and then execute that same for loop again (in the case of a bitmap, we could send the previous data to pixel memory and then, every time a cluster is full, send the data in chunks)? I am pretty new to this concept, but I really hope this report is useful.
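The chunked approach suggested above can be sketched like this. Again this is illustrative Python, not Cosmos code, and the names (read_in_clusters, consume) are hypothetical: instead of accumulating the whole file in one buffer, the reader processes one 4096-byte cluster at a time and hands each chunk to a consumer (e.g. copying that cluster's pixels to video memory) before reusing the buffer:

```python
import io

# Sketch (hypothetical names): read a stream one cluster at a time and
# pass each full cluster to a consumer instead of buffering the whole
# file, so memory use stays bounded at one cluster.

CLUSTER_SIZE = 4096  # bytes per cluster, matching the value above


def read_in_clusters(stream, consume):
    """Read stream in CLUSTER_SIZE chunks, calling consume(chunk) for each.

    Returns the total number of bytes read."""
    total = 0
    while True:
        chunk = stream.read(CLUSTER_SIZE)
        if not chunk:
            break
        consume(chunk)  # e.g. send this cluster's pixel data onward
        total += len(chunk)
    return total


# Pretend 10,000-byte bitmap payload: two full clusters plus a remainder.
chunks = []
n = read_in_clusters(io.BytesIO(bytes(10000)), chunks.append)
print(n, [len(c) for c in chunks])  # 10000 [4096, 4096, 1808]
```

This avoids the need for any ClearClusterData-style reset, since each iteration simply overwrites the previous chunk.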