I have seen a couple of questions related to string concatenation in SQL. I wonder how you would approach the opposite problem: splitting a comma-delimited string into rows of data.

Let's say I have these tables:

userTypedTags(userID,commaSeparatedTags) 'one entry per user
tags(tagID,name)

And I want to insert data into the table

userTag(userID,tagID) 'multiple entries per user

Inspired by the question Which tags are not in the database?

EDIT

Thanks for the answers. Actually, more than one deserves to be accepted, but I can only pick one, and the solution presented by Cade Roux using recursion seems pretty clean to me. It works on SQL Server 2005 and above.

For earlier versions of SQL Server, the solution provided by miies can be used. For working with the text data type, wcm's answer will be helpful. Thanks again.

+9  A: 

I use this function (SQL Server 2005 and above).

create function [dbo].[Split]
(
    @string nvarchar(4000),
    @delimiter nvarchar(10)
)
returns @table table
(
    [Value] nvarchar(4000)
)
begin
    declare @nextString nvarchar(4000)
    declare @pos int, @nextPos int

    set @nextString = ''
    -- Append a trailing delimiter so the loop also captures the last piece
    set @string = @string + @delimiter

    set @pos = charindex(@delimiter, @string)
    set @nextPos = 1
    while (@pos <> 0)
    begin
        -- Extract the piece before the next delimiter
        set @nextString = substring(@string, 1, @pos - 1)

        insert into @table
        (
            [Value]
        )
        values
        (
            @nextString
        )

        -- Advance past the delimiter; len(@delimiter) handles multi-character delimiters
        set @string = substring(@string, @pos + len(@delimiter), len(@string))
        set @nextPos = @pos
        set @pos = charindex(@delimiter, @string)
    end
    return
end
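Given a function like this, the tables from the question can be wired together with CROSS APPLY. This is a sketch: it assumes the tag names stored in the comma-separated strings match tags.name exactly (no case or whitespace differences).

```sql
-- Split each user's tag string into one row per tag,
-- then resolve each tag name to its ID via the tags table.
insert into userTag (userID, tagID)
select u.userID, t.tagID
from userTypedTags u
cross apply dbo.Split(u.commaSeparatedTags, ',') s
join tags t on t.name = s.[Value];
```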
Thanks for that. I think that will also work in SQL Server 2000.
kristof
You're right. I thought table-valued functions were introduced with SQL Server 2005, but they weren't.
@commaCheck is not used, other than an assignment, so it can be removed.
David
Also, set @string = substring(@string, @pos + 1, len(@string)) should be set @string = substring(@string, @pos + len(@delimiter), len(@string))
David
A: 

Sounds like you need to separate that column into its own table. If tags are only stored in a delimited list, how would you write efficient SQL to find records associated with a specific tag?

Joel Coehoorn
That is the idea behind the question: turning the list-of-tags string into a set of records.
kristof
Okay, that makes more sense then. So I should be hounding your predecessor for creating that mess in the first place ;)
Joel Coehoorn
+1  A: 

I wrote this a while back. It assumes the delimiter is a comma and that the individual values aren't bigger than 127 characters. It could be modified pretty easily.

It has the benefit of not being limited to 4,000 characters.

Good luck!

ALTER Function [dbo].[SplitStr] (
        @txt text
)
Returns @tmp Table
        (
                value varchar(127)
        )
as
BEGIN
        declare @str varchar(8000)
                , @Beg int
                , @last int
                , @size int

        set @size = datalength(@txt)
        set @Beg = 1

        -- Read the first chunk of up to 8000 characters from the text value
        set @str = substring(@txt, @Beg, 8000)
        IF len(@str) < 8000 set @Beg = @size
        ELSE BEGIN
                -- Cut the chunk back to its last comma so no value is split across chunks
                set @last = charindex(',', reverse(@str))
                set @str = substring(@txt, @Beg, 8000 - @last)
                set @Beg = @Beg + 8000 - @last + 1
        END

        -- varchar(127) to match the declared 127-character limit on values
        declare @workingString varchar(127)
                , @stringindex int

        while @Beg <= @size Begin
                -- Split the current chunk on commas
                WHILE LEN(@str) > 0 BEGIN
                        SELECT @StringIndex = CHARINDEX(',', @str)

                        SELECT
                                @workingString = CASE
                                        WHEN @StringIndex > 0 THEN SUBSTRING(@str, 1, @StringIndex - 1)
                                        ELSE @str
                                END

                        INSERT INTO
                                @tmp(value)
                        VALUES
                                (cast(rtrim(ltrim(@workingString)) as varchar(127)))

                        SELECT @str = CASE
                                WHEN CHARINDEX(',', @str) > 0 THEN SUBSTRING(@str, @StringIndex + 1, LEN(@str))
                                ELSE ''
                        END
                END

                -- Fetch the next chunk
                set @str = substring(@txt, @Beg, 8000)

                if @Beg = @size set @Beg = @Beg + 1
                else IF len(@str) < 8000 set @Beg = @size
                ELSE BEGIN
                        set @last = charindex(',', reverse(@str))
                        set @str = substring(@txt, @Beg, 8000 - @last)
                        set @Beg = @Beg + 8000 - @last + 1
                END
        END

        return
END
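A quick sanity check might look like the following sketch; the explicit cast is needed because the parameter is declared as text:

```sql
-- Expect one row per comma-separated value, trimmed of surrounding spaces
SELECT value FROM dbo.SplitStr(cast('red, green, blue' as text));
```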
wcm
+1  A: 
SELECT substring(commaSeparatedTags,0,charindex(',',commaSeparatedTags))

will give you the first tag. You can proceed similarly to get the second one, and so on, by combining substring and charindex one layer deeper each time. That's an immediate solution, but it only works with very few tags, as the query grows very quickly in size and becomes unreadable. Then move on to functions, as outlined in other, more sophisticated answers to this post.
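To make the "one layer deeper" idea concrete, here is what extracting the second tag would look like, following the same pattern. This is only a sketch: it assumes the string contains at least two commas, and it already shows how quickly the nesting gets out of hand.

```sql
-- Second tag: skip past the first comma, then take everything up to the next one
SELECT substring(
           substring(commaSeparatedTags,
                     charindex(',', commaSeparatedTags) + 1,
                     len(commaSeparatedTags)),
           0,
           charindex(',',
                     substring(commaSeparatedTags,
                               charindex(',', commaSeparatedTags) + 1,
                               len(commaSeparatedTags))))
FROM userTypedTags
```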

Yann Semet
+17  A: 

There is a wide variety of solutions to this problem documented here, including this little gem:

CREATE FUNCTION dbo.Split (@sep char(1), @s varchar(512))
RETURNS table
AS
RETURN (
    WITH Pieces(pn, start, stop) AS (
      SELECT 1, 1, CHARINDEX(@sep, @s)
      UNION ALL
      SELECT pn + 1, stop + 1, CHARINDEX(@sep, @s, stop + 1)
      FROM Pieces
      WHERE stop > 0
    )
    SELECT pn,
      SUBSTRING(@s, start, CASE WHEN stop > 0 THEN stop-start ELSE 512 END) AS s
    FROM Pieces
  )
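Since this is an inline table-valued function, it can be queried directly. A quick illustration (note the delimiter comes first in this signature):

```sql
-- Returns one row per piece, with pn giving the 1-based position
SELECT pn, s FROM dbo.Split(',', 'red,green,blue');
-- pn  s
-- 1   red
-- 2   green
-- 3   blue
```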
Cade Roux
Fantastic function. It could do with using nchar() and nvarchar(). Also see below for a suggestion with a variable-length delimiter.
Rory
On SQL Server 2008, this version fails when there are more than 101 items in the list: "The statement terminated. The maximum recursion 100 has been exhausted before statement completion."
Mike Schenk
@MikeSchenk You can use the OPTION (MAXRECURSION n) hint (http://msdn.microsoft.com/en-us/library/ms181714.aspx) to change the level of recursion - however, it's not allowed in UDF definitions. This question (http://social.msdn.microsoft.com/forums/en-US/transactsql/thread/7c7d5fea-38ad-4bc5-9038-a157e640561f) would imply that you can specify it outside the UDF and have it still work.
Cade Roux
Warning: fails with larger input strings (above about 1000 characters). "The maximum recursion 100 has been exhausted before statement completion."
cbp
I realize I'm jumping into this incredibly late... like 1.5 years, but there's a pretty interesting solution that doesn't require any recursion and wasn't mentioned here. I've posted it below.
md5sum
+5  A: 

A slight modification of the solution above so that it works with variable-length delimiters.

create FUNCTION dbo.fn_Split2 (@sep nvarchar(10), @s nvarchar(4000))
RETURNS table
AS
RETURN (
    WITH Pieces(pn, start, stop) AS (
      SELECT 1, 1, CHARINDEX(@sep, @s)
      UNION ALL
      SELECT pn + 1, stop + (datalength(@sep)/2), CHARINDEX(@sep, @s, stop + (datalength(@sep)/2))
      FROM Pieces
      WHERE stop > 0
    )
    SELECT pn,
      SUBSTRING(@s, start, CASE WHEN stop > 0 THEN stop-start ELSE 4000 END) AS s
    FROM Pieces
  )

NB: I've used datalength() since len() reports incorrectly if there are trailing spaces.

Rory
+7  A: 

You can also achieve this effect using XML, as seen here, which removes the limitation of the other answers, all of which seem to involve recursion in some fashion. The particular version I've written here allows for up to a 32-character delimiter, but that could be increased to whatever size is needed.

CREATE FUNCTION [dbo].[Split] (@sep VARCHAR(32), @s VARCHAR(MAX))

RETURNS @t TABLE
    (
        val VARCHAR(MAX)
    )
AS
    BEGIN
        DECLARE @xml XML
        SET @xml = N'<root><r>' + REPLACE(@s, @sep, '</r><r>') + '</r></root>'

        INSERT INTO @t(val)
        SELECT r.value('.', 'VARCHAR(MAX)') as Item
        FROM @xml.nodes('//root/r') AS RECORDS(r)

        RETURN
    END

Then you can invoke it using:

SELECT * FROM dbo.Split(' ', 'I hate bunnies')

Which returns:

-----------
|I        |
|---------|
|hate     |
|---------|
|bunnies  |
-----------


I should note, I don't actually hate bunnies... it just popped into my head for some reason.
The following is the closest thing I could come up with using the same method in an inline table-valued function. DON'T USE IT, IT'S HORRIBLY INEFFICIENT! It's just here for reference's sake.

CREATE FUNCTION [dbo].[Split] (@sep VARCHAR(32), @s VARCHAR(MAX))
RETURNS TABLE
AS
    RETURN
    (
        SELECT r.value('.', 'VARCHAR(MAX)') as Item
        FROM (SELECT CONVERT(XML, N'<root><r>' + REPLACE(@s, @sep, '</r><r>') + '</r></root>') as valxml) x
        CROSS APPLY x.valxml.nodes('//root/r') AS RECORDS(r)
    )
md5sum
@md5sum: Great solution!
p.campbell
@md5sum - if you could get this into an inline table-valued function, I'd be onboard. Typically scalar functions perform horribly on SQL Server. I'd love to see this benchmarked against the ITVF solution.
Cade Roux
@Cade Roux - I'm sorry, but I don't think I understand exactly what you're asking here... this is a table-valued function. Unless you're just wanting it to return the selected value rather than inserting it into the temp table?
md5sum
@md5sum - inline table-valued functions (without a BEGIN) are handled completely differently than multi-statement table-valued functions (as this one is), and the two generally perform differently. Of course, you never can tell until you benchmark them...
Cade Roux
@md5sum I see I mentioned scalar functions - I probably forgot to finish my thought - performance on UDFs is generally best for inline TVFs, then multi-statement TVFs, then scalar functions (which are completely horrible).
Cade Roux
@Cade Roux - The inline version (the best incarnation I could come up with) is a complete joke in comparison. I even tried it a couple of extra times just to ensure that something hadn't kicked off on the server to skew my results. The version posted above, splitting 3168 rows of "I HATE BUNNIES", consistently takes ~110ms. The inline version using the same parameters consistently takes ~1:25.175. I can post the inline version if you'd like. I may have done something horribly wrong when I built it, but as far as I can tell I did it the only proper way possible.
md5sum
@md5sum I'm late voting you up, but wanted to note that I'd be interested to see how it scales for a lot more rows. I know that even simple scalar functions can perform horribly versus inline code on hundreds of thousands of rows.
Cade Roux
@Cade Roux - I'm not really sure. I know that I've used it for around 1000 and it performs fine. However, I would argue that SQL Server isn't the place to do that amount of string manipulation anyway (at least not in any reasonable application architecture design patterns).
md5sum
@md5sum I agree this is probably a modeling/design issue. However, if you've got terabytes of data sometimes you don't want to pull them into yet another system to do something with them. Obviously the design of the data should be improved to facilitate the anticipated processing. Sometimes it's not so easy to stream millions of rows through something better than T-SQL. Which is why design defects are 1000x times more difficult to compensate for when discovered in production stage of life-cycle.
Cade Roux
A: 

This is the logic I have been using for four years... Check this out: http://praveenbattula.blogspot.com/2010/06/best-split-udf-function-with-delimeter.html

Rare Solutions
+2  A: 

For the particular case of splitting strings into words, I've come across another solution, for SQL Server 2008.

with testTable AS
(
SELECT 1 AS Id, N'how now brown cow' AS txt UNION ALL
SELECT 2, N'she sells sea shells upon the sea shore' UNION ALL
SELECT 3, N'red lorry yellow lorry' UNION ALL
SELECT 4, N'the quick brown fox jumped over the lazy dog'
)

SELECT display_term, COUNT(*) As Cnt
 FROM testTable
CROSS APPLY sys.dm_fts_parser('"' + txt + '"', 1033, 0,0)
GROUP BY display_term
HAVING COUNT(*) > 1
ORDER BY Cnt DESC

Returns

display_term                   Cnt
------------------------------ -----------
the                            3
brown                          2
lorry                          2
sea                            2
Martin Smith