I am attempting to use objcopy to convert an XML file into an object file that is then linked into, and used by, a shared library on RHEL5. I convert the file with this command:
objcopy --input-format binary --output-target i386-pc-linux-gnu --binary-architecture i386 baselines.xml baselines.0
The object file is created and using readelf I get the following:
Symbol table '.symtab' contains 5 entries:
   Num:    Value  Size Type    Bind   Vis      Ndx Name
     0: 00000000     0 NOTYPE  LOCAL  DEFAULT  UND
     1: 00000000     0 SECTION LOCAL  DEFAULT    1
     2: 00000000     0 NOTYPE  GLOBAL DEFAULT    1 _binary_baselines_xml_sta
     3: 0000132b     0 NOTYPE  GLOBAL DEFAULT    1 _binary_baselines_xml_end
     4: 0000132b     0 NOTYPE  GLOBAL DEFAULT  ABS _binary_baselines_xml_siz
So it looks like the size is in there. I dumped the file and verified that the XML is embedded as ASCII at offset 34 (the offset given for .data) and that it is correct. The data is 0x132b bytes in size, matching the size symbol.
Then, in the code, I declare a couple of variables:
extern "C"
{
extern char _binary_baselines_xml_start;
extern char _binary_baselines_xml_size;
}
static const char* xml_start = &_binary_baselines_xml_start;
const uint32_t xml_size = reinterpret_cast<uint32_t>(&_binary_baselines_xml_size);
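One thing I notice is that the object also defines an _end symbol, which suggests a size-by-subtraction alternative. Here is a minimal self-contained sketch of that approach; the inline asm blob is a stand-in for the objcopy-generated object (in the real build these labels come from baselines.0), and the 12-byte payload is made up:

```cpp
#include <cstddef>

// Stand-in for the objcopy-generated object so the snippet links by
// itself; in the real build these labels come from baselines.0.
// The payload below is a made-up placeholder, not my real XML.
__asm__(
    ".globl _binary_baselines_xml_start\n"
    ".globl _binary_baselines_xml_end\n"
    "_binary_baselines_xml_start:\n"
    ".ascii \"<baselines/>\"\n"
    "_binary_baselines_xml_end:\n");

extern "C"
{
    extern char _binary_baselines_xml_start;
    extern char _binary_baselines_xml_end;
}

// Both symbols are section-relative, so they are relocated identically
// and their difference stays valid even inside a shared library --
// unlike the ABS-typed *_size symbol.
static const char*  xml_start = &_binary_baselines_xml_start;
static const size_t xml_size  = static_cast<size_t>(
    &_binary_baselines_xml_end - &_binary_baselines_xml_start);
```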
When I step into this, the xml pointer is correct and I can see the XML text in the debugger. The size symbol, however, shows the value 0x132b (which is what I want) but the debugger also reports "Address 0x132b is out of bounds", and when I actually use the variable its value is a huge, seemingly random number. I've tried all sorts of other declarations for the extern variable (char*, char[], int, int*, etc.), but the result is always the same: the value is there, yet I can't get at it.
Another point of interest: the same code works fine on a Windows machine, except that there the extern symbols have no prepended underscore; everything else is identical.
I can't seem to find much online about using objcopy in this manner so any help is greatly appreciated.