Hi

I'm doing this:

@snippets = Snippet.find :all, :conditions => { :user_id => session[:user_id] }

@snippets.each do |snippet|
  snippet.tags.each do |tag|
    @tags.push tag
  end
end

But if a snippet has the same tag twice, it'll push the same object twice.

I want to do something like if @tags.in_object(tag)[...]

Would that be possible? Thanks!

A: 

I'm assuming @tags is an Array instance.

Array#include? tests if an object is already included in an array. This uses the == operator, which in ActiveRecord tests for the same instance or another instance of the same type having the same id.

Alternatively, you may be able to use a Set instead of an Array. This will guarantee that no duplicates get added, but is unordered.
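
For what it's worth, a rough sketch of both suggestions, assuming @snippets has already been loaded and @tags starts out as an empty Array:

require 'set'

# Skip tags that have already been collected:
@tags = []
@snippets.each do |snippet|
  snippet.tags.each do |tag|
    @tags.push(tag) unless @tags.include?(tag)
  end
end

# Or collect into a Set, which silently drops duplicates (but is unordered):
@tags = Set.new
@snippets.each do |snippet|
  snippet.tags.each { |tag| @tags << tag }
end
@tags = @tags.to_a  # convert back if the view expects an Array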

Phil Ross
A: 

You can probably add a group to the query:

Snippet.find :all, :conditions => { :user_id => session[:user_id] }, :group => "tag.name"

Group will depend on how your tag data works, of course.
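
For instance, if Tag has an association back to snippets (an assumption here, since the tagging setup isn't shown), the grouped query could be run against the tags themselves, roughly:

# Sketch only; association, table and column names are guesses.
# Needs a Rails version that accepts symbol :joins (2.1+);
# otherwise spell out the SQL join by hand.
@tags = Tag.find(:all,
                 :joins      => :snippets,
                 :conditions => ["snippets.user_id = ?", session[:user_id]],
                 :group      => "tags.name")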

Or use uniq:

@tags.concat(snippet.tags).uniq!
Toby Hede
+1  A: 

Another way would be to simply concat the @tags and snippet.tags arrays and then strip the result of duplicates.

@snippets.each do |snippet|
  @tags.concat(snippet.tags)
end

@tags.uniq!
m5h
With your code I still have the same tags many times. And what's the use of ! in @tags.uniq? Thanks
Tom
The ! means it will change @tags rather than just return a new array. If your tags are not simple strings you need to implement the hash and eql? methods in your Tag class so that uniq treats tags based on the name of the tag.
m5h
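
To illustrate the point in the comment above, a hypothetical plain-Ruby Tag class (names invented here) wired up so that uniq compares tags by name could look like this:

class Tag
  attr_reader :name

  def initialize(name)
    @name = name
  end

  # uniq (like Hash and Set membership) relies on eql? and hash,
  # so both must agree with each other.
  def eql?(other)
    other.is_a?(Tag) && other.name == name
  end

  def hash
    name.hash
  end
end

[Tag.new("ruby"), Tag.new("ruby"), Tag.new("rails")].uniq.size  # => 2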
+1  A: 

I think there are two ways to go about getting a faster result.

1) Add a condition to your find statement (in MySQL, DISTINCT). This will return only unique results. DBs in general do a much better job than regular code at filtering results (see the sketch after the example code below).

2) Instead of testing each time with include?, why not call uniq after you populate your array?

Here is some example code:

ar = []
data = []

# get some random sample data
100.times do
  data << (rand * 10).to_i
end

# populate your result array
# 3 ways to do it.
# 1) you can modify your original array in place with

data.uniq!

# 2) you can populate another array with your unique data
#    this doesn't modify your original array
ar.concat(data.uniq)

# 3) you can run a loop if you want to do some sort of additional processing
#    (data has already been deduplicated by uniq! above)

data.each do |i|
  i = i.to_s + "some text" # do whatever you need here
  ar << i
end
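
And as a sketch of the first suggestion, letting MySQL drop the duplicates with DISTINCT (the snippets_tags join table and column names are guesses, adjust to however your tagging is actually set up):

# DISTINCT at the database level; returns each tag row once.
@tags = Tag.find_by_sql([<<-SQL, session[:user_id]])
  SELECT DISTINCT tags.*
  FROM tags
  INNER JOIN snippets_tags ON snippets_tags.tag_id = tags.id
  INNER JOIN snippets      ON snippets.id = snippets_tags.snippet_id
  WHERE snippets.user_id = ?
SQL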

Depending on the situation you may use any of these.

But running include? on each item in the loop is not the fastest thing, IMHO.

Good luck

Nick Gorbikoff
Not to speak out of thin air, I just ran a small test with 1 million integer values (resetting the arrays each time). On average, of the 3 proposed solutions for getting unique items into an array, my first proposal is the fastest: about 30% to 70% faster than the second, and 4-5 times faster than the third. Also keep in mind that if you are comparing complex objects it's going to be even slower, so running a DISTINCT on the DB is still the fastest solution.
Nick Gorbikoff
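
For reference, a rough sketch of how such a timing comparison might be set up with Ruby's Benchmark module (the figures above are the commenter's own; the check-as-you-go loop is the include? approach the answer warns against):

require 'benchmark'

data = Array.new(1_000_000) { rand(10) }

Benchmark.bm(10) do |bm|
  # 1) dedupe in place
  bm.report("uniq!")    { d = data.dup; d.uniq! }
  # 2) dedupe into a new array
  bm.report("uniq")     { ar = data.uniq }
  # 3) check each item with include? while looping
  bm.report("include?") do
    ar = []
    data.each { |i| ar << i unless ar.include?(i) }
  end
end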