Hi
I have been working on an implementation of the linear-time algorithm from
the research paper. The algorithm involves a lot of computation over induced
subgraphs, and I have made some design decisions that I would like to get
reviewed.
1. I do not store an induced graph entirely in memory. For each induced
graph I create a set and a list containing all of its vertices. The edges
incident to those vertices are not stored separately and have to be read
from the original graph.
- The advantage of this approach is that it saves the memory and
computation needed to build the induced graph explicitly. This is useful
for large graphs, where the number of edges can be of order O(n^2).
- The drawback is that traversing the edges of a vertex in the induced
graph requires scanning all the edges of that vertex in the original
graph and skipping neighbours that are not in the induced graph.
- The set and list could be combined into a single data structure using
the C++ set implementation, since it allows iterating over the elements
of the set. I have currently implemented everything in C (a sketch of the
representation follows after point 2 below).
2. For the set implementation I use one bit per node to record whether the
node is present in the induced graph. This reduces the memory required by a
factor of 32 compared to storing one 32-bit integer per node, but it assumes
that the nodes are numbered contiguously, starting from 0 or 1 (a bitset
sketch also follows below).
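
To make point 1 concrete, here is a minimal sketch in C of what I mean by
"vertex set plus vertex list, edges read from the original graph". The names
(graph, induced_graph, for_each_induced_neighbour, etc.) are only
illustrative and not taken from my actual code:

/* Original graph stored as adjacency lists (illustrative layout). */
typedef struct {
    int n;        /* number of vertices, numbered 0 .. n-1 */
    int **adj;    /* adj[v] = array of neighbours of v     */
    int *deg;     /* deg[v] = length of adj[v]             */
} graph;

/* Induced subgraph: only the vertex set is stored; edges are always
   read from the original graph and filtered on the fly. */
typedef struct {
    char *in_set;   /* in_set[v] != 0 iff v is in the subgraph */
    int *vertices;  /* list of member vertices, for iteration  */
    int count;      /* number of vertices in the subgraph      */
} induced_graph;

/* Visit the neighbours of v inside the induced subgraph by scanning
   v's full adjacency list in the original graph and skipping any
   neighbour that is not a member of the subgraph. */
static void for_each_induced_neighbour(const graph *g,
                                       const induced_graph *ig,
                                       int v,
                                       void (*visit)(int u, void *ctx),
                                       void *ctx)
{
    int i;
    for (i = 0; i < g->deg[v]; i++) {
        int u = g->adj[v][i];
        if (ig->in_set[u])
            visit(u, ctx);
    }
}

The filtering in the loop is exactly the extra traversal cost described in
the second bullet of point 1.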
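For point 2, this is a rough sketch of the bitset I have in mind (again with
made-up names), assuming vertices are labelled contiguously from 0; the
byte-per-vertex flag in the previous sketch would be replaced by this:

#include <stdint.h>
#include <stdlib.h>

/* One bit per vertex instead of one 32-bit int per vertex. */
typedef struct {
    uint32_t *words;   /* ceil(n / 32) machine words */
    int n;
} vertex_set;

static vertex_set *vertex_set_new(int n)
{
    vertex_set *s = malloc(sizeof *s);
    s->words = calloc((size_t)(n + 31) / 32, sizeof *s->words);
    s->n = n;
    return s;
}

static void vertex_set_add(vertex_set *s, int v)
{
    s->words[v / 32] |= (uint32_t)1 << (v % 32);
}

static int vertex_set_contains(const vertex_set *s, int v)
{
    return (s->words[v / 32] >> (v % 32)) & 1u;
}

static void vertex_set_free(vertex_set *s)
{
    free(s->words);
    free(s);
}

If the node labels were not contiguous, a renumbering step (or a hash map)
would be needed before this representation can be used.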
I am currently implementing the code required for the recursive
factorization step. I would appreciate a review of these design decisions
and any suggestions for changes that would improve performance.
Thanks
Lokesh
On Tuesday, March 7, 2017 at 11:03:12 PM UTC+5:30, Dima Pasechnik wrote:
>
>
>
> On Tuesday, March 7, 2017 at 4:26:52 PM UTC, Lokesh Jain wrote:
>>
>> Hi
>>
>> I am M.Sc.(Hons.) Mathematics and B.E.(Hons.) Computer Science student at
>> BITS Pilani, Pilani Campus. I am currently interning with the Apache Hadoop
>> YARN team in Hortonworks. I am very interested in Graph Theory and look
>> forward to implementing the Modular Decomposition of Graphs algorithm as a
>> GSOC project. I have found a research paper "Simpler Linear-Time Modular
>> Decomposition via Recursive Factorizing Permutations
>> <http://www.cs.toronto.edu/~mtedder/TedderModular.pdf>"
>>
>> corresponding to that. It defines a practical linear time algorithm for
>> modular decomposition of graphs. I wanted to ask whether I should go ahead
>> with this research paper and try to implement it?
>>
>
> yes, this is one that definitely should be implemented (in C or Python). The
> only implementation I know is in Java,
> and it's probably quite old too.
>
>
>
>>
>> Regards
>> Lokesh
>>
>