As my Ruby script runs, which is a very long series of loops, each iteration parses a random HTML file via Nokogiri.
top shows the memory consumption (%) climbing by 0.1 every few seconds, along with the CPU usage.
Eventually the script crashes with a "not enough memory" error.
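For reference, the growth can be observed with a minimal diagnostic sketch like the one below (it assumes a Linux-style /proc filesystem, and some.html is just a placeholder file):

require 'nokogiri'

# Diagnostic sketch only: parse a file repeatedly and print the process RSS
# (in KB) after each parse, assuming a Linux-style /proc filesystem.
def rss_kb
  File.read("/proc/#{Process.pid}/status")[/VmRSS:\s+(\d+)/, 1].to_i
end

html = File.read("some.html")   # placeholder file
10.times do |i|
  Nokogiri::HTML(html).xpath("/html/body/p")
  puts "iteration #{i}: RSS = #{rss_kb} KB"
end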
UPDATED with the latest version of the code:
require 'nokogiri'

def extract(newdoc, newarray)
  doc = Nokogiri::HTML(newdoc)
  # Evaluate each XPath expression against the parsed document.
  collection = newarray.map { |s| doc.xpath(s) }

  dd = ""
  # Interleave the matches: the first hit of every expression, then the second, and so on.
  (0...collection.first.length).each do |i|
    (0...collection.length).each do |j|
      dd += collection[j][i].to_s
    end
  end

  # Drop the references in the hope that the parsed document gets collected.
  collection = nil
  newarray = nil
  doc = nil

  puts dd.chop + "\n"
end
100_000.times do
  extract("somerandomHTMLfile", ["/html/body/p", "/html/body/h1"])
end
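Is the expected workaround something like forcing a GC run between iterations, as in the sketch below? (The interval is arbitrary and only meant to test whether GC pressure is the cause, not something from my real setup.)

100_000.times do |i|
  extract("somerandomHTMLfile", ["/html/body/p", "/html/body/h1"])
  GC.start if (i % 100).zero?   # arbitrary interval, purely diagnostic
end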