[hugin-ptx] Re: assertion ImagesPanel.cpp:143 m_cleaningButton

2010-02-21 Thread Andrew Mihal
Hi,
I figured it out - I was trying to keep two separate svn builds
accessible at the same time and messed up the xrc files. Giving each
sandbox a separate install-prefix solved the problem.

Andrew

On Feb 21, 3:01 pm, Bruno Postle br...@postle.net wrote:
 On Fri 19-Feb-2010 at 23:40 -0800, Andrew Mihal wrote:

  Anyone know a solution for this assertion in the latest svn?

  hugin: /home/mihal/hugin-trunk/src/hugin1/hugin/ImagesPanel.cpp:143: bool
  ImagesPanel::Create(wxWindow*, wxWindowID, const wxPoint&, const wxSize&,
  long int, const wxString&): Assertion `m_cleaningButton' failed.

 I don't see this crash here; this code hasn't changed for months.

 Is this at startup, or when switching to the Images tab?

 --
 Bruno

-- 
You received this message because you are subscribed to the Google Groups
"hugin and other free panoramic software" group.
A list of frequently asked questions is available at: 
http://wiki.panotools.org/Hugin_FAQ
To post to this group, send email to hugin-ptx@googlegroups.com
To unsubscribe from this group, send email to 
hugin-ptx+unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/hugin-ptx


[hugin-ptx] Re: Fwd: Re: Re: possible memory leak in enblend & enfuse?

2009-10-20 Thread Andrew Mihal

Hi Pablo,
Can you elaborate a little on your criticism of snakes?

Is the problem:
- That the polyline formulation cannot describe a good solution at all?
- That the size of the state space in the current implementation is
too restrictive?
- That the current implementation's cost functions do not correlate
with a good solution?
- That the current implementation's GDA solver is insufficiently
powerful to find a good answer to the optimization problem?
- something else?

It seems to me that the best seamline can be modeled as a polyline,
and the search space can be defined in such a way as to include the
best seam as a possibility. I admit that the current search space
definition is a little rough, and that the GDA is buggy and needs
parameter tuning. I think the hard part is defining what "best" means
and turning that into equations. But surely the graph cut approach has
this same requirement?
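To make the polyline cost model concrete, here is a minimal sketch of scoring a candidate seam polyline against a per-pixel cost grid. The grid, the sampling scheme, and all names are illustrative assumptions, not enblend's actual code:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// A seam candidate is a polyline over a per-pixel cost grid (high cost =
// disagreement between images or outside the overlap).  Points are assumed
// to lie inside the grid.
struct Point { double x, y; };

// Average grid cost sampled along one segment of the polyline.
double segmentCost(const std::vector<std::vector<double>>& grid,
                   Point a, Point b) {
    const double len = std::hypot(b.x - a.x, b.y - a.y);
    const int samples = std::max(1, static_cast<int>(std::ceil(len)));
    double cost = 0.0;
    for (int i = 0; i <= samples; ++i) {
        const double t = static_cast<double>(i) / samples;
        const std::size_t x = static_cast<std::size_t>(a.x + t * (b.x - a.x));
        const std::size_t y = static_cast<std::size_t>(a.y + t * (b.y - a.y));
        cost += grid[y][x];
    }
    return cost / (samples + 1);
}

// Total cost of the polyline: sum over its segments.
double polylineCost(const std::vector<std::vector<double>>& grid,
                    const std::vector<Point>& poly) {
    double total = 0.0;
    for (std::size_t i = 0; i + 1 < poly.size(); ++i)
        total += segmentCost(grid, poly[i], poly[i + 1]);
    return total;
}
```

A solver (GDA, or a graph cut over an equivalent formulation) would then search for the seam minimizing such a cost.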

Andrew

On Sat, Oct 17, 2009 at 10:18 AM, Pablo d'Angelo pablo.dang...@web.de wrote:

 Hi Andrew,

 Andrew Mihal wrote:
 Hi,
 I suspect a problem in the vectorization of the seam lines.

 Actually, the approach of using vectorized seam lines is a relatively
 complicated process. Additionally, snakes are not particularly well
 known for finding good global solutions. I think a different approach to
 seam line finding would avoid these problems and also lead to better
 seams. One possibility is the graph cut based segmentation algorithms.
 Here the masks are generated directly, and there is no need to go
 from masks to vectors and back to masks. I'm also quite sure that the
 generated seams would be better than with the current snake algorithm.

 ciao
  Pablo

 





Re: Fwd: Re: [hugin-ptx] Re: possible memory leak in enblend & enfuse?

2009-10-17 Thread Andrew Mihal

Hi,
I suspect a problem in the vectorization of the seam lines. There
is currently no checking that the MaskVectorizeDistance parameter is
suitable for the number of actual pixels on the seam (the points
visited by the CrackContourCirculator). Thus we can construct snakes
that undersample the seam and have only one or two vertices. This is
likely to happen when the distance transform result has many small,
fragmented patches. For each seam we should compute an appropriate
MaskVectorizeDistance heuristically.
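The heuristic proposed above could look roughly like this. The function name, arguments, and the minimum-vertex constant are placeholders, not enblend's real API:

```cpp
#include <algorithm>
#include <cstddef>

// Sketch: instead of a fixed MaskVectorizeDistance, derive the vertex
// spacing from the number of pixels actually on the seam (the points
// visited by the contour walk), so that even a short, fragmented contour
// yields enough vertices to form a usable snake.
std::size_t adaptiveVectorizeDistance(std::size_t contourLengthPixels,
                                      std::size_t requestedDistance,
                                      std::size_t minVertices = 4) {
    if (contourLengthPixels == 0 || minVertices == 0) return 1;
    // Largest spacing that still gives at least minVertices vertices.
    const std::size_t maxDistance =
        std::max<std::size_t>(1, contourLengthPixels / minVertices);
    return std::min(requestedDistance, maxDistance);
}
```

On long contours the requested distance is used unchanged; on short ones it is clamped so the snake cannot degenerate to one or two vertices.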

The condition that leads to the error message "mask is entirely black,
but the white image was not identified as redundant" also needs to be
examined. If the seam optimization is allowed to remove snakes
entirely, either because the vectorization decides the contour is too
small or because the annealer collapses the contour, then this
condition should not be an error. The white image does have pixels to
contribute (according to the distance transform), but the optimization
decided that it was not worthwhile to blend them in.

Currently the optimization does not consider the contour area as a
constraint. I think this requires more thought. Should the
optimization be allowed to collapse contours or is this a bug?

Andrew

On Sat, Oct 17, 2009 at 5:23 AM, cspiel csp...@freenet.de wrote:

 Roger -

 On Oct 16, 11:53 am, Rogier Wolff rew-googlegro...@bitwizard.nl
 wrote:
 Most people are not this familiar
 with the code, and simply fire up a GUI. The hugin-0.7.0 GUI, I
 suspect, simply blended all the images from an exposure stack.

        What do you suggest Enblend should do?
 Should it detect an almost complete overlap
 of a pair of images and report the problem to
 the user?  This would be very helpful in the
 case we discuss here, but leads us to the problem
 of how to identify these pairs of images.
 Furthermore, how can we code this efficiently?


 It is a funny situation that only crashes enblend in weird
 circumstances, but it is still a bug in enblend that is good to have
 fixed. So even though the original test-data is a bit nonsensical, it
 did allow us to find and fix a bug.

        You are totally right.  Enblend must
 behave even with nonsensical data.  Right now
 the Distance Transform routine goes ballistic
 when two images almost completely overlap.  I
 have some ideas, but I'm also curious about your
 suggestions.


 /Chris

 





[hugin-ptx] Re: possible memory leak in enblend & enfuse?

2009-10-14 Thread Andrew Mihal

Hi,
A correct fix will require determining the cause of the two-point
snake. A two-point polygon has zero area, so it is unclear what region
of the mask this is outlining. Perhaps the mask has isolated
single-pixel spots of black and white? E.g. if the user set the input
alpha masks with spots like this. Maybe the contour iterator does not
handle this case gracefully.
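The zero-area observation can be checked directly with the shoelace formula. This is a hedged sketch of a guard the vectorizer could apply; the names and the area threshold are illustrative:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Vertex { double x, y; };

// Signed area of a closed polygon via the shoelace formula.  A two-point
// "polygon", as described above, always comes out as zero area.
double polygonArea(const std::vector<Vertex>& poly) {
    double twiceArea = 0.0;
    const std::size_t n = poly.size();
    for (std::size_t i = 0; i < n; ++i) {
        const Vertex& a = poly[i];
        const Vertex& b = poly[(i + 1) % n];
        twiceArea += a.x * b.y - b.x * a.y;
    }
    return twiceArea / 2.0;
}

// A guard the vectorizer could apply before building a snake: reject
// contours whose enclosed area is (near) zero, e.g. isolated single-pixel
// spots or degenerate two-point contours.
bool isDegenerateContour(const std::vector<Vertex>& poly,
                         double minArea = 0.5) {
    return poly.size() < 3 || std::fabs(polygonArea(poly)) < minArea;
}
```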

Andrew

On Wed, Oct 14, 2009 at 6:51 AM, Rogier Wolff
rew-googlegro...@bitwizard.nl wrote:


 Hi Chris,


 On Wed, Oct 14, 2009 at 12:55:23AM -0700, cspiel wrote:
 Oh, I just wanted to follow the DRY principle and to heed Roger's
 concerns about the runtime penalty of an if-clause inside the loop
 of all snake segments.  Nothing fancy, really.

 I wasn't concerned about the runtime implications of the if. I simply
 dislike a special case for a situation that should be possible to
 handle in the main loop.

 so strcmp is typically implemented as:

        while (*a && (*a == *b)) {
                a++; b++;
        }
        return *a - *b;

 No special cases, and it quickly handles end-of-string, etc. Nice and
 neat.

 Similarly, the case of rotating a list that has only one element on
 one of the ends should not need special casing.

 I was more concerned with the neatness of the code, and the size of
 the code base. If the program grows and grows, it will too quickly
 become unmaintainable.

        Roger.

 --
 ** r.e.wo...@bitwizard.nl ** http://www.BitWizard.nl/ ** +31-15-2600998 **
 **    Delftechpark 26 2628 XH  Delft, The Netherlands. KVK: 27239233    **
 *-- BitWizard writes Linux device drivers for any device you may have! --*
 Q: It doesn't work. A: Look buddy, doesn't work is an ambiguous statement.
 Does it sit on the couch all day? Is it unemployed? Please be specific!
 Define 'it' and what it isn't doing. - Adapted from lxrbot FAQ

 





[hugin-ptx] Re: nona-gpu - has anybody got it working?

2009-08-25 Thread Andrew Mihal

Hi,
The error means that the GLSL compiler that is built into your
video card driver is unhappy with the syntax nona-gpu is giving it.
This is a bug in the video card driver. I checked in a possible
workaround to hugin svn. Please give it another try.

Thanks,
Andrew

On Mon, Aug 24, 2009 at 4:42 PM, Yuval Levy goo...@levy.ch wrote:

 Harry van der Wolf wrote:
 I patched the nona part on OSX to make it compile, and that works. Running
 it with the -g option is failing. See the complete log below. As I'm not a
 programmer I do not really have a clue what I have to change in the code
 itself.

 Someone??

 thanks for the work and for the log, Harry.


 nona: normalization/photometric shader program could not be compiled.
 nona: GL info log:
 ERROR: 0:35: 'array of float' : constructor not supported for type
 ERROR: 0:35: 'array of float' : no matching overloaded function found
 ERROR: 0:35: 'radialVigCorrCoeff' : redefinition

 I'm no expert either, so maybe my hypothesis is completely wrong. I
 would like to exclude a driver issue and the best way to find out is if
 you can set up an Ubuntu partition on your IntelMac and try to run the
 exact same nona project there.

 I had a different error at the same stage. Failed on Windows. Passed on
 Ubuntu. Updated Windows driver and it passed there too. Now on my system
 it fails at a later point, on both Windows and Ubuntu.

 Yuv

 





[hugin-ptx] Re: enblend algorithm

2009-06-23 Thread Andrew Mihal

Hi,
The vertex state space is already restricted to the perpendicular
lines as you suggest. When the initial seam line bends, these state
space lines tend to intersect, which leads to geometrically
uninteresting solutions where the seam line doubles back on itself.
Addressing this issue would be a good start. One of the motivations
for choosing annealing was the ability to encode more complex
optimization criteria into the cost function, like a penalty for seams
that have self-intersections, or are too straight, or too curvy, or
reduce the enclosed area of a closed contour by too much, etc. The
current implementation is very rudimentary, both in the cost function
implementation and the annealer itself. I would put more time into it
if I had any to spare.

Andrew

On Thu, Jun 18, 2009 at 1:06 AM,
r.e.wolff r.e.wo...@harddisk-recovery.nl wrote:

 On Jun 14, 8:12 am, Andrew Mihal andrewcmi...@gmail.com wrote:
 The seam line optimization uses a two-step approach influenced by
 research on active contours. The overlap region between an image pair
 is treated as a cost function. Areas of disagreement and areas outside
 the intersection region have high cost. First, the result of the
 nearest feature transform is vectorized into a polyline, and a
 generalized deterministic annealing algorithm is used to adjust the
 vertex positions to optimize the cost of the line. The line is
 penalized for crossing areas of high cost and vertices are penalized
 for moving far from their initial positions in the center of the
 overlap region. Second, Dijkstra's shortest path algorithm is used to
 fill in the exact seam line between polyline vertices.

 Hmm. I worked on real-time minimum cost contour detection in
 '92-'95. Why the annealing? There is an algorithm that will give you
 the exact minimum cost.

 I would take the initial seam line, take lines perpendicular to it,
 and along each of those lines sample the cost function: a parabolic
 function for distance from the original seam, plus some function of the
 difference between the two images. Preferably the number of points on
 those perpendicular lines is always the same. This gives a rectangular
 matrix of cost points. Now, for every point in the matrix, you have
 three options of getting there from the line above: diagonally from the
 left, diagonally from the right, or straight down. If you move down the
 matrix this way, you'll find the minimum-cost path from top to bottom
 through the matrix, which transforms to a line more or less parallel to
 the initial seam.
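The matrix search described here is a classic dynamic program. A minimal sketch, assuming a rectangular, non-empty cost matrix:

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Each row is the sampled cost along one perpendicular line; from a cell
// you may step down-left, straight down, or down-right.  Returns the
// minimum total cost from the top row to the bottom row.  For this
// restricted move set the result is the exact optimum, no annealing needed.
double minCostPath(const std::vector<std::vector<double>>& cost) {
    if (cost.empty()) return 0.0;
    std::vector<double> best = cost.front();
    for (std::size_t r = 1; r < cost.size(); ++r) {
        const std::size_t w = cost[r].size();
        std::vector<double> next(w);
        for (std::size_t c = 0; c < w; ++c) {
            double up = best[c];                              // straight down
            if (c > 0)     up = std::min(up, best[c - 1]);    // from the left
            if (c + 1 < w) up = std::min(up, best[c + 1]);    // from the right
            next[c] = cost[r][c] + up;
        }
        best = std::move(next);
    }
    return *std::min_element(best.begin(), best.end());
}
```

Keeping a back-pointer per cell instead of just the running minimum would recover the seam itself rather than only its cost.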
 





[hugin-ptx] Re: enblend algorithm

2009-06-14 Thread Andrew Mihal

Hi,
The initial seam line generation I wrote is a nearest feature
transform based on a Voronoi transformation (Breu et al.). There is a bug
in my implementation. I have a half-complete repair that is not
checked in yet, and Christoph has a repair in his staging branch. I
intend to replace the algorithm entirely with Danielsson's 4SED. I
also plan to write a version that computes the distance transform on
the surface of a sphere for full-360 images. Lack of time is the
primary factor holding back these projects.

The seam line optimization uses a two-step approach influenced by
research on active contours. The overlap region between an image pair
is treated as a cost function. Areas of disagreement and areas outside
the intersection region have high cost. First, the result of the
nearest feature transform is vectorized into a polyline, and a
generalized deterministic annealing algorithm is used to adjust the
vertex positions to optimize the cost of the line. The line is
penalized for crossing areas of high cost and vertices are penalized
for moving far from their initial positions in the center of the
overlap region. Second, Dijkstra's shortest path algorithm is used to
fill in the exact seam line between polyline vertices.
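As a rough illustration of the second step, a Dijkstra fill-in between two polyline vertices over a per-pixel cost image might look like this. The 4-connected grid and the "entering a pixel costs its value" model are simplifying assumptions, not enblend's implementation:

```cpp
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

// Dijkstra over a non-empty rectangular cost grid: cheapest total cost of a
// 4-connected path from one polyline vertex to the next, where entering a
// pixel costs that pixel's value (the start pixel is paid for too).
double cheapestPath(const std::vector<std::vector<double>>& cost,
                    std::pair<int, int> from, std::pair<int, int> to) {
    const int h = static_cast<int>(cost.size());
    const int w = static_cast<int>(cost[0].size());
    const double inf = std::numeric_limits<double>::infinity();
    std::vector<std::vector<double>> dist(h, std::vector<double>(w, inf));
    using Node = std::pair<double, std::pair<int, int>>;  // (distance, (row, col))
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> pq;
    dist[from.first][from.second] = cost[from.first][from.second];
    pq.push({dist[from.first][from.second], from});
    const int dr[] = {-1, 1, 0, 0}, dc[] = {0, 0, -1, 1};
    while (!pq.empty()) {
        auto [d, rc] = pq.top();
        pq.pop();
        if (rc == to) return d;
        if (d > dist[rc.first][rc.second]) continue;  // stale queue entry
        for (int k = 0; k < 4; ++k) {
            const int r = rc.first + dr[k], c = rc.second + dc[k];
            if (r < 0 || r >= h || c < 0 || c >= w) continue;
            const double nd = d + cost[r][c];
            if (nd < dist[r][c]) { dist[r][c] = nd; pq.push({nd, {r, c}}); }
        }
    }
    return inf;  // unreachable target
}
```

With predecessors recorded, the same search yields the exact pixel path making up the seam segment.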

If you are interested in working on the seam line optimization, some
points to consider are:

- How to handle seams that are closed contours, e.g. the kind of
overlaps that are formed when patching the zenith or nadir of a
full-360 panorama.
- How to reduce the state space of the optimization to maintain
acceptable performance.
- How to avoid geometrically uninteresting seam solutions, such as
seam lines that double back on themselves.
- How to encode the abstract notion of seam invisibility as the
optimization goal.
- How to identify image features that should or should not appear in
the final panorama.

I think these are all interesting problems and would be happy to
discuss ideas with you.

Andrew

On Sat, Jun 13, 2009 at 5:42 AM, cspiel csp...@freenet.de wrote:

 Tago -

 On Jun 12, 10:06 am, tago liquidt...@gmail.com wrote:
 Why zero?

        If you are on the border, i.e. at the
 red or green lines then the distance to the
 respective border is zero.


 If I start the line in the middle of the overlap wouldn't
 the horizontal distance between red and green borders exactly half of
 the overlap width ?

        What is the middle?  How would you
 define it?  Does your definition also hold if
 you turn the example 90 degrees?

 As I said before the problem is manifestly
 two-dimensional.  Just looking at one border is
 not enough.  We must look at _all_ borders at
 the same time to find out what is the minimum
 distance to any of them.


 Is the line generation based on something like the grassfire
 transform ?

        I'm sorry, but I cannot tell you what
 Enblend's original algorithm is based upon,
 because I never understood it.  8-|

 The new algorithm employed in the staging
 branch uses two nearest neighbor transforms
 (NNT) of the differences between the mask files
 A and B.  It constructs the seam mask by
 assigning zeros to pixels closer to the borders
 of A-B and ones to pixels closer to the borders
 of B-A.  The boundary between the zeros and the
 ones makes up the initial seam line.

 You see that neither the computationally
 expensive Grassfire-, nor Watershed-
 transformations are necessary.  The final
 assignment step is a point-wise operation and
 thus of linear complexity in the image size.  It
 parallelizes trivially.
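The point-wise assignment step described above can be sketched as follows, given the two distance maps as flat arrays. distToA and distToB are stand-ins for the real nearest neighbor transform results:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Final point-wise step: given the two nearest neighbor (distance)
// transforms, one per mask difference, build the seam mask by comparing
// distances pixel by pixel.  Zeros go to pixels closer to the borders of
// A-B, ones to pixels closer to the borders of B-A.  Linear in the image
// size and trivially parallel.
std::vector<std::uint8_t> buildSeamMask(const std::vector<double>& distToA,
                                        const std::vector<double>& distToB) {
    assert(distToA.size() == distToB.size());
    std::vector<std::uint8_t> seam(distToA.size());
    for (std::size_t i = 0; i < seam.size(); ++i)
        seam[i] = distToA[i] <= distToB[i] ? 0 : 1;  // closer to A-B => 0
    return seam;
}
```

The boundary between the zero and one regions of the returned mask is the initial seam line.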

 Latest revisions of staging do compute both
 NNTs in parallel threads, too, given the project
 has been configured with --disable-image-cache
 --enable-openmp and the compiler supports
 OpenMP.


 Are there details on how optimization works ?

        I guess there are lots of them!  It is
 just that I have not looked at the seam-line
 optimization close enough to tell you what is
 going on there.  What about you inspecting it
 and teaching me?


 I guess one needs to
 find a line that does not pass through mismatches (i.e. parallax or
 moving objects).

        As far as I understand, this is the way
 Enblend works.  It would explain why SmartBlend
 sometimes produces superior output.

 I have an idea of how to improve on the
 optimization, though.  Hopefully, it will take
 us on a par with SmartBlend.  (Kornel will love
 to hear this;)


 HTH,
        Chris

 





[hugin-ptx] Re: which preview?

2009-01-30 Thread Andrew Mihal

Hi Pablo,
Yes, the parameters are compiled into the shader programs, which
are regenerated for each image in the project. You could pull these
values out as uniform parameters to the GLSL shader program, but I
did not do this. There is a limit on the number of parameters you can
have - I think it is 64.

Andrew

On Fri, Jan 30, 2009 at 2:33 PM, Pablo d'Angelo pablo.dang...@web.de wrote:

 Andrew Mihal wrote:
 I expect that nona-gpu will not be efficient for rendering the
 previews, due to the overhead of compiling GLSL code for each input
 image and transferring the data back from the GPU to the CPU before
 drawing it in the preview window. The approach used by the GL preview
 is superior in this situation.

 I haven't looked into your GLSL code generation yet, but are the
 parameters (hfov, r, p, y, etc.) also compiled into the GLSL code for each
 image?

 This is related to the bug Yuval put in the tracker, which I attached
 a comment to.

 If the frame rate and quality of the GL preview are determined to be
 insufficient, it should be possible to adapt the nona-gpu approach to
 produce vertex shader programs that would offload the mesh
 transformation from the CPU onto the GPU. I apologize for not looking
 at the actual implementation of the GL preview before making this
 suggestion. This might be nonsense.

 The OpenGL preview is quite fast, even on systems without hardware
 acceleration, as the software texture mapping routines seem to be
 reasonably optimized. The main problem is that the GL preview uses a
 forward mapping to generate the mesh, and this complicates matters at
 zenith, nadir and the 360 deg seam.

 ciao
  Pablo

 





[hugin-ptx] Re: segmentation fault when stitching

2009-01-10 Thread Andrew Mihal

Hi,
One thing to add - you might want to use the -x option for Enblend
to turn on checkpointing of partial results. This tells Enblend to
save the results after each blend step, so if it does crash after the
100th image, you aren't left with nothing. This is off by default
because it slows down the process. You can then resume by running
Enblend again using the partial result as an input along with the
unprocessed images. Then if this reproduces the crash, send in a bug
report.

Andrew

On Thu, Jan 8, 2009 at 1:24 AM, Joris Van Dael jov...@telenet.be wrote:

 Hi Seb,

 Every time I try, it's another image that triggers the crash. The "tiny closed
 contour" thing happens with other images too... I'll try again tonight and I
 bet it will crash again at yet another .tif...

 J.

----- Original message -----
From: Seb Perez-D [mailto:sbprzd+...@gmail.com]
Sent: Thursday, January 8, 2009 08:37 AM
To: hugin-ptx@googlegroups.com
Subject: [hugin-ptx] Re: segmentation fault when stitching

On Thu, Jan 8, 2009 at 08:17, Joris Van Dael jov...@telenet.be wrote:

 500 MB didn't work, so I thought I'd try 2,000 MB.  Alas... Any other
  ideas? Thanks,
 Joris

 Loading next image: architectural0111.tif
 Creating blend mask: 1/4 2/4 3/4 4/4
 Optimizing 2 distinct seams.
 Strategy 1, s0: 1/4 2/4 3/4 4/4
 Strategy 1, s1:

 enblend: Seam s1 is a tiny closed contour and was removed after
 optimization.
  Strategy 2: s0 s1
 gnumake: *** [architectural.tif] Bus error
 gnumake: *** Deleting file `architectural.tif'


This seems to be a problem with enblend, not hugin. I am surprised by the
"Seam s1 is a tiny closed contour" message, which makes me think this is a
particular image. What happens if you remove the 0111 image?

Cheers,

Seb



 

