views: 549
answers: 5
I have a multi-dimensional array that I'd like to use for building an xml output.

The array stores a CSV import, where people[0][...] holds the column names that will become the XML tags, and people[...>0][...] holds the values.

For instance, array contains:
people[0][0] => first-name
people[0][1] => last-name
people[1][0] => Bob
people[1][1] => Dylan
people[2][0] => Sam
people[2][1] => Shepard

XML needs to be:
<person>
  <first-name>Bob</first-name>
  <last-name>Dylan</last-name>
</person>
<person>
  <first-name>Sam</first-name>
  <last-name>Shepard</last-name>
</person>

Any help is appreciated.

+7  A: 

I suggest using FasterCSV to import your data and to convert it into an array of hashes. That way to_xml should give you what you want:

people = []
FasterCSV.foreach("yourfile.csv", :headers => true) do |row|
  people << row.to_hash
end
people.to_xml
Roel
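
By default, Array#to_xml on an array of hashes picks generic element names; if you want each row wrapped in <person> as in the question, it accepts naming options. A minimal sketch, assuming Rails' ActiveSupport is loaded:

# :root/:children name the wrapper and per-row elements; :skip_instruct drops the <?xml?> line
people.to_xml(:root => 'people', :children => 'person', :skip_instruct => true)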
Nice solution, +1
fd
Got this error: undefined method `to_hash' for #<Array:0x2506840>
Jeffrey
Jeffrey - the above code will work if you are using the Rails environment or script/console. If you are using just the basic Ruby interpreter or irb, it will give you the above error; you will need to require some libraries for XML support if using plain Ruby / irb.
Phil
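
For reference, a minimal sketch of the requires needed outside of Rails, assuming the fastercsv and activesupport gems are installed:

require 'rubygems'
require 'fastercsv'       # FasterCSV and FasterCSV::Row#to_hash
require 'active_support'  # Array#to_xml / Hash#to_xml core extensions

people = []
FasterCSV.foreach("yourfile.csv", :headers => true) do |row|
  people << row.to_hash
end
puts people.to_xml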
I'm getting this error inside of Rails. I've tried the code inside of a controller and, to test, inside of a view; both give the same error: undefined method 'to_hash'
Jeffrey
Sorry about that, I edited the code. I forgot to add :headers => true. Hopefully that solves your undefined method error.
Roel
Thanks Roel. That worked for importing a local copy of a csv file, but I'm stuck figuring out where to include the :headers => true when importing a csv file from a remote location. url = 'http://remote-url.com/file.csv' people = [] open(url) do |f| f.each_line do |line| FasterCSV.parse(line) do |row| people << row.to_hash end end end people.to_xml
Jeffrey
please see below
Jeffrey
+2  A: 

There are two main ways I can think of achieving this: one using an XML serializer, the other by pushing out the raw string.

Here's an example of the second:

xml = ''
1.upto(people.size-1) do |row_idx|
  xml << "<person>\n"
  people[0].each_with_index do |column, col_idx|
    xml << "  <#{column}>#{people[row_idx][col_idx]}</#{column}>\n"
  end
  xml << "</person>\n"
end

Another way:

hash = {}
hash['person'] = []
1.upto(people.size-1) do |row_idx|
  row = {}
  people[0].each_with_index do |column, col_idx|
    row[column] = people[row_idx][col_idx]
  end
  hash['person'] << row
end
hash.to_xml

Leaving this answer here in case someone needs to convert an array like this that didn't come from a CSV file (or if they can't use FasterCSV).

fd
fd - the second one works great with the array. Is there any way to add people[0][x].parameterize to the code?
Jeffrey
Assuming I understand correctly, change row[column]=... to row[column.parameterize]=...
fd
Thanks, that does work. Though I'm moving away from using a hash because the display order is important... unless there is a workaround for that? The app is using Rails 2.3.4.
Jeffrey
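
One possible workaround, not from the thread: Rails 2.3 ships ActiveSupport::OrderedHash, which keeps insertion order on Ruby 1.8, so a variant of fd's second example could preserve the column order. A sketch:

hash = ActiveSupport::OrderedHash.new
hash['person'] = []
1.upto(people.size - 1) do |row_idx|
  row = ActiveSupport::OrderedHash.new   # keys iterate in insertion order
  people[0].each_with_index do |column, col_idx|
    row[column.parameterize] = people[row_idx][col_idx]
  end
  hash['person'] << row
end
hash.to_xml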
+1  A: 

Using Hash.to_xml is a good idea, since it's supported in core Rails. It's probably the simplest way to export Hash-like data to simple XML, at least in simple cases; more complex cases require more complex tools.

Henryk Konsek
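
For illustration, a sketch of such a call on data shaped like the question's, assuming ActiveSupport is loaded (the exact attributes in the output vary by Rails version):

{ 'person' => [
    { 'first-name' => 'Bob', 'last-name' => 'Dylan' },
    { 'first-name' => 'Sam', 'last-name' => 'Shepard' }
] }.to_xml(:root => 'people', :skip_instruct => true)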
A: 

Thanks Roel. That worked for importing a local copy of a csv file, but I'm stuck figuring out where to include the :headers => true when importing a csv file from a remote location.

url = 'http://remote-url.com/file.csv'
people = []
open(url) do |f|
  f.each_line do |line|
    FasterCSV.parse(line) do |row|
      people << row.to_hash
    end
  end
end
people.to_xml
Jeffrey
From your example, I assume you can expect to load the whole file into memory. That being said, try this: FasterCSV.foreach(open(url).string, :headers => true) do |row| people << row.to_hash end. However, if the file is large, we will need to take a different approach.
Roel
The files are quite large; when I tried the code, a load message came up saying it was too long. Any other ideas?
Jeffrey
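
A possible streaming approach for large files, not from the thread (a sketch assuming open-uri and FasterCSV, with the hypothetical URL used above): hand the open IO straight to FasterCSV so rows are parsed one at a time with the header row applied.

require 'open-uri'
require 'fastercsv'

url = 'http://remote-url.com/file.csv'
people = []
open(url) do |io|
  # FasterCSV.new accepts an IO, so each row is parsed as it is read
  # instead of building the whole file into one string first.
  FasterCSV.new(io, :headers => true).each do |row|
    people << row.to_hash
  end
end
people.to_xml

Note that open-uri still spools large responses to a tempfile on disk, so the download itself is not held in memory as a single string.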
A: 

Thanks to everyone that posted. Below is the solution that seems to work best for my needs. Hopefully others may find this useful.

This solution grabs a remote url csv file, stores it in a multi-dimensional array, then exports it as xml:

require 'rio'
require 'fastercsv'

url = 'http://remote-url.com/file.csv'
people = FasterCSV.parse(rio(url).read)

xml = ''
1.upto(people.size-1) do |row_idx|
  xml << "  <record>\n"
  people[0].each_with_index do |column, col_idx|
    xml << "    <#{column.parameterize}>#{people[row_idx][col_idx]}</#{column.parameterize}>\n"
  end
  xml << "  </record>\n"
end

There are better solutions out there; using hash.to_xml would have been great, except I needed to parameterize the CSV header line to use the values as XML tags. But this code works, so I'm happy.

Jeffrey