I am writing a CLI utility in C that analyzes PNG files and outputs data about them. More specifically, it prints out the length, CRC and type values of each chunk in the PNG file. I am using the official specification for the PNG file format, and it says that each chunk has a CRC value encoded in it for data integrity.
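For reference, this is my understanding of the on-disk chunk layout from the spec, plus the kind of big-endian read I believe it implies (a sketch only; read_be32 is just a name I made up for illustration, it is not part of my program):

    #include <stdint.h>
    #include <stdio.h>

    /* Each chunk is laid out as:
     *   LENGTH (4 bytes, big-endian) | TYPE (4 bytes) | DATA (LENGTH bytes) | CRC (4 bytes, big-endian)
     * The CRC is computed over the TYPE and DATA fields, not over the LENGTH field.
     */

    /* Read a 4-byte big-endian integer from the stream. */
    static uint32_t read_be32(FILE *f) {
        uint8_t b[4];
        if (fread(b, 1, 4, f) != 4) {
            return 0; /* real code would handle the short read */
        }
        return ((uint32_t)b[0] << 24) | ((uint32_t)b[1] << 16) |
               ((uint32_t)b[2] << 8)  |  (uint32_t)b[3];
    }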
My tool runs fine: it outputs the correct values for length and type, and what appears to be a correct value for the CRC (as in, it is formatted as a 4-byte hexadecimal value). The only problem is that every time I run the program, the value of the CRC changes. Is this normal, and if not, what could be causing it?
Here is the main part of the code:
    CHUNK chunk;
    BYTE buffer;
    int i = 1;

    while (chunk.type != 1145980233) {
        // 1145980233 is a magic number that signals our program that IEND chunk
        // has been reached; it is just the decimal equivalent of 'IEND'
        printf("============\nCHUNK: %i\n", i);

        // Read LENGTH value; we have to buffer and then append to length hexdigit-by-hexdigit
        // to account for reversals of byte-order when reading infile
        // (im not sure why this reversal only happens here)
        for (unsigned j = 0; j < 4; ++j) {
            fread(&buffer, 1, sizeof(BYTE), png);
            chunk.length = (chunk.length | buffer) << 8; // If length is 0b4e and buffer is 67 this makes sure
                                                         // that length ends up 0b4e67 and not 0b67
        }
        chunk.length = chunk.length >> 8; // Above bitshifting ends up adding an extra 00 to end of length
                                          // This gets rid of that
        printf("LENGTH: %u\n", chunk.length);

        // Read TYPE value
        fread(&chunk.type, 4, sizeof(BYTE), png);

        // Print out TYPE in chars
        printf("TYPE: ");
        printf("%c%c%c%c\n", chunk.type & 0xff, (chunk.type & 0xff00) >> 8,
               (chunk.type & 0xff0000) >> 16, (chunk.type & 0xff000000) >> 24);

        // Allocate LENGTH bytes of memory for data
        chunk.data = calloc(chunk.length, sizeof(BYTE));

        // Populate DATA
        for (unsigned j = 0; j < chunk.length; ++j) {
            fread(&buffer, 1, sizeof(BYTE), png);
        }

        // Read CRC value
        for (unsigned j = 0; j < 4; ++j) {
            fread(&chunk.crc, 1, sizeof(BYTE), png);
        }
        printf("CRC: %x\n", chunk.crc);

        printf("\n");
        i++;
    }

Here are some preprocessor directives and global variables:
    #define BYTE uint8_t

    typedef struct {
        uint32_t length;
        uint32_t type;
        uint32_t crc;
        BYTE* data;
    } CHUNK;

Here are some examples of the output I am getting:
Run 1 -

    ============
    CHUNK: 1
    LENGTH: 13
    TYPE: IHDR
    CRC: 17a6a400
    ============
    CHUNK: 2
    LENGTH: 2341
    TYPE: iCCP
    CRC: 17a6a41e

Run 2 -

    ============
    CHUNK: 1
    LENGTH: 13
    TYPE: IHDR
    CRC: 35954400
    ============
    CHUNK: 2
    LENGTH: 2341
    TYPE: iCCP
    CRC: 3595441e

Run 3 -
    ============
    CHUNK: 1
    LENGTH: 13
    TYPE: IHDR
    CRC: 214b0400
    ============
    CHUNK: 2
    LENGTH: 2341
    TYPE: iCCP
    CRC: 214b041e

As you can see, the CRC values are different on every run, yet within a single run they are fairly similar (only the last byte differs between the two chunks: 00 for IHDR and 1e for iCCP). My intuition tells me this should not be the case and that the CRC of a given chunk should not change from run to run.
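In case it helps, this is the kind of byte-by-byte dump I can add right after the CRC printf to see exactly which bytes differ between runs (a sketch using the same chunk variable from the code above):

    // Print each byte of the stored CRC field separately for easier comparison between runs.
    printf("CRC bytes: %02x %02x %02x %02x\n",
           (chunk.crc >> 24) & 0xff,
           (chunk.crc >> 16) & 0xff,
           (chunk.crc >> 8) & 0xff,
           chunk.crc & 0xff);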
Just to make sure, I also ran
    $ cat test.png > file1
    $ cat test.png > file2
    $ diff -s file1 file2
    Files file1 and file2 are identical

so accessing the file two different times doesn't change the CRC values stored in it, as expected.
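Another independent check I could do is dump the stored IHDR CRC bytes straight from the file at a fixed offset (this sketch assumes the same test.png and uses the fact from the output above that IHDR's data length is 13, so its CRC starts at byte offset 8 + 4 + 4 + 13 = 29):

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        // 8-byte PNG signature + 4-byte length + 4-byte type + 13 bytes of IHDR data = offset 29
        FILE *f = fopen("test.png", "rb");
        if (!f) { perror("fopen"); return 1; }
        if (fseek(f, 29, SEEK_SET) != 0) { perror("fseek"); fclose(f); return 1; }

        uint8_t crc[4];
        if (fread(crc, 1, 4, f) != 4) { fprintf(stderr, "short read\n"); fclose(f); return 1; }
        printf("stored IHDR CRC bytes: %02x %02x %02x %02x\n", crc[0], crc[1], crc[2], crc[3]);

        fclose(f);
        return 0;
    }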
Thanks,
https://stackoverflow.com/questions/66738640/value-of-crc-changing-everytime-program-is-run