After searching around a bit and consulting the Dinosaur Book, I've come to SO seeking wisdom. Note that this is somewhat homework-related, but isn't actually a homework problem. Also, this is in C.
I'm working with a kernel that currently allocates memory in 4K chunks. In an attempt to cut down on wasted memory, I've written my own malloc-like allocator: it grabs a 4K page and then hands out memory from that page as needed. That part is currently working fine. I plan to have a linked list of pages of memory. Memory is handled as a char*, so my struct has a char* in it, along with some ints describing the page and a pointer to the next node.
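For reference, here is roughly what a node in that list looks like; the field names are just placeholders for illustration:

```
/* Rough sketch of one node in the page list. The real struct just has
 * a char* to the page, a few ints describing it, and a next pointer.  */
struct page_node {
    char *mem;               /* the 4K page this node hands memory out of */
    int   used;              /* bytes handed out so far                    */
    int   free;              /* bytes still available                      */
    struct page_node *next;  /* next page in the linked list               */
};
```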
The question is this: I plan to use a bit vector to keep track of which memory is free and which is used, with 1 bit in the vector corresponding to 1 byte in the page. I want to figure out how many 32-bit (4-byte) integers I need to track every 1-byte block in the page. The catch is that the bit vector itself must live inside the same 4K page, so I need the number of integers that satisfies the 1-bit-per-byte constraint while still fitting in the 4K.
Or rather, I need to maximize the "actual" memory I'll have while minimizing the number of integers required to map one bit per byte, with both parts (the "actual" memory and the bit vector) living in the same page.
Due to the information about the page and the pointers stored in the node, I won't actually have the full 4K to play with, but something closer to 4062 bytes.
I believe this to be a linear programming problem, but the formulations of it that I've tried haven't worked out.
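For concreteness, this is the relationship I'm trying to capture, sketched out with placeholder names and numbers; I'm not at all sure it's the right way to set things up:

```
#include <stdio.h>

int main(void)
{
    int usable = 4062;   /* bytes left in the page after the node info   */

    /* Each 32-bit int in the bit vector costs 4 bytes of the page and
     * can track 32 one-byte blocks, so one int accounts for at most
     * 4 + 32 = 36 bytes of the page.                                    */
    int n_ints = (usable + 35) / 36;      /* ceiling of usable / 36      */
    int data   = usable - 4 * n_ints;     /* bytes left for actual use   */

    printf("%d ints -> %d bytes of bit vector, %d bytes of memory\n",
           n_ints, 4 * n_ints, data);
    /* With usable = 4062 this gives 113 ints (452 bytes of bit vector)
     * and 3610 bytes of memory; 113 ints can track up to 3616 bytes,
     * so the vector covers the data with a few bits to spare.           */
    return 0;
}
```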