Problem Statement:

URLs are stored in a database, for example:

home/page1
gallery/image1
info/IT/contact
home/page2
home/page3
gallery/image2
info/IT/map

and so on.

I would like to arrange the above URLs into a tree, as shown below (each item will be a URL link). The final output would be a simple HTML list, plus any sub-list(s). Thus:

home         gallery           info
  page1         image1           IT
  page2         image2            contact
  page3                           map

The programming language is C# and the platform is ASP.NET.

EDIT 1:

In the above example, we end up with three lists, because there are three main 'groups': home, gallery, and info.

Naturally, this can change, so the algorithm needs to be able to build the lists recursively.
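For illustration, the kind of markup I am after would be one list per group, something like this (hand-written here just to show the target shape; gallery would follow the same pattern):

    home
    <ul>
      <li>page1</li>
      <li>page2</li>
      <li>page3</li>
    </ul>

    info
    <ul>
      <li>IT
        <ul>
          <li>contact</li>
          <li>map</li>
        </ul>
      </li>
    </ul>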

A: 

Well, sorting those strings needs a fair amount of work. I've done something similar to your situation, so I'd like to share the strategy with you.

First of all (if you can indeed change the design of your tables), create a URL table like the one below:

----------------
|   URL Table  |
----------------
|  ID          | 
|  ParentID    |
|  Page        |
|..extra info..|
----------------

It's an implementation of category and subcategory in the same table. In the same manner, you can insert any number of pages and subpages. For example:

-------------------------------------
|  ID  |   ParentID   |    Page     | ...
 ------------------------------------
|   0  |     null     |    Home     |
|   1  |     null     |   Gallery   |
|   2  |     null     |    Info     |
|   3  |       0      |    Page1    |
|   4  |       0      |    Page2    |
|   5  |       0      |    Page3    | ...
|   6  |       1      |    Image1   |
|   7  |       1      |    Image2   |
|   8  |       2      |      IT     |
|   9  |       8      |    contact  |
|  10  |       8      |     map     |
------------------------------------- ...

When ParentID is null, the page is at the highest level.
When ParentID is an ID, the page is a sublevel of whatever page has that ID, and so on.

From the C# side, you know the top pages: the ones whose ParentID is null.
You can then bring in their subpages by selecting on the IDs of the top pages. It's some ADO.NET work.
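For example, a minimal sketch of the loading step might look like the code below. The connection string, the table name URL, and the UrlRow/UrlRepository names are assumptions here; adjust them to your actual schema:

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    public class UrlRow
    {
        public int Id { get; set; }
        public int? ParentId { get; set; }
        public string Page { get; set; }
    }

    public static class UrlRepository
    {
        // Loads the whole URL table in one query; the page tree can then be
        // assembled in memory by matching ParentId values against Ids.
        public static List<UrlRow> LoadAll(string connectionString)
        {
            var rows = new List<UrlRow>();
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("SELECT ID, ParentID, Page FROM [URL]", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        rows.Add(new UrlRow
                        {
                            Id = reader.GetInt32(0),
                            // ParentID is null for top-level pages.
                            ParentId = reader.IsDBNull(1) ? (int?)null : reader.GetInt32(1),
                            Page = reader.GetString(2)
                        });
                    }
                }
            }
            return rows;
        }
    }

Top-level pages are then the rows with ParentId == null, and the children of the page with ID n are the rows with ParentId == n.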

Hope this helps
Myra

Myra, many thanks, you nudged me in the right direction.
Darknight
A: 

Once you have created multiple sitemaps using your logic, you can merge them with a sitemap tool; the work is nothing but loading the existing sitemaps and merging them.

A: 

OK, I did it:

First, I created a class:

 // A Node records one URL segment (Child) together with the segment
 // that immediately precedes it in the URL (Parent).
 public class Node
 {
     public string Parent { get; set; }

     public string Child { get; set; }

     // True for the first segment of a URL, i.e. a top-level page.
     public bool IsRoot { get; set; }

     public Node(string child, string parent)
     {
         Child = child;
         Parent = parent;
         IsRoot = false;
     }
 }

then generated the sitemap by transforming the URL strings directly, as follows:

    private static string MakeTree()
    {
        // Sample URL records; in practice these come from the database.
        List<string> urlRecords = new List<string>();
        urlRecords.Add("home/image1");
        urlRecords.Add("home/image2");
        urlRecords.Add("IT/contact/map");
        urlRecords.Add("IT/contact/address");
        urlRecords.Add("IT/jobs");

        List<Node> myTree = ExtractNode(urlRecords);

        // Each root segment starts its own list.
        List<string> roots = new List<string>();
        foreach (Node itm in myTree)
        {
            if (itm.IsRoot)
            {
                roots.Add(itm.Child);
            }
        }

        string trees = string.Empty;
        foreach (string root in roots)
        {
            trees += GetChildren(root, myTree) + "<hr/>";
        }

        return trees;
    }

    private static string GetChildren(string PRoot, List<Node> PList)
    {
        // Nothing to render if this node has no children.
        if (!PList.Exists(x => x.Parent.Equals(PRoot)))
        {
            return string.Empty;
        }

        string res = string.Empty;

        // Emit the header text only when PRoot is a top-level node;
        // recursive calls for inner nodes skip this.
        foreach (Node x in PList)
        {
            if (x.IsRoot && PRoot == x.Child)
            {
                res += x.Child;
            }
        }

        res += "<ul>\n";
        foreach (Node itm in PList)
        {
            if (itm.Parent.Equals(PRoot))
            {
                // One <li> per child; its own children (if any) are rendered
                // as a nested list by the recursive call.
                res += string.Format("<li>{0}{1}</li>\n", itm.Child, GetChildren(itm.Child, PList));
            }
        }
        res += "</ul>\n";

        return res;
    }

    private static List<Node> ExtractNode(List<string> Urls)
    {
        List<Node> nodeList = new List<Node>();

        foreach (string itm in Urls)
        {
            string[] arr = itm.Split('/');

            for (int idx = 0; idx < arr.Length; idx++)
            {
                if (idx == 0)
                {
                    // First segment of the URL: a root node with no parent.
                    Node node = new Node(arr[idx], "");
                    if (!nodeList.Exists(x => x.Child == node.Child && x.Parent == node.Parent))
                    {
                        node.IsRoot = true;
                        nodeList.Add(node);
                    }
                }
                else
                {
                    // Later segments: parented to the segment just before them.
                    Node node = new Node(arr[idx], arr[idx - 1]);
                    if (!nodeList.Exists(x => x.Child == node.Child && x.Parent == node.Parent))
                    {
                        nodeList.Add(node);
                    }
                }
            }
        }

        return nodeList;
    }

Anyway, it's not optimised; I'm sure I can clean it up a lot.
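For reference, a minimal way to exercise the code above might look like this (the enclosing class name TreeBuilder is hypothetical; the three methods above are assumed to be members of it, with the Node class alongside):

    using System;
    using System.Collections.Generic;

    public static class TreeBuilder
    {
        // The MakeTree, GetChildren and ExtractNode methods shown above
        // are assumed to live here, with the Node class next to them.

        public static void Main()
        {
            // Prints one <ul> tree per root (home, IT), each followed by <hr/>.
            Console.WriteLine(MakeTree());
        }
    }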

Darknight