On Tuesday, August 16, 2022 at 3:25:26 PM UTC-4 Florian Königstein wrote:

>
>> But I took parts of the old code and built a command line utility named 
>> Hugin2PTOptimizer that converts a Hugin .pto file to a .pto file readable 
>> by PTOptimizer.
>> I have messed around a bit in the code and put everything into one
>> file, Hugin2PTOptimize.c.  Therefore it's not maintainable, but
>> hopefully it does what it should do.
>> I compiled it with VS 2019. If there are problems compiling it with 
>> Linux, let me know.
>>
It does not compile in mingw64, and I'm certain it would not on
Linux; there are far too many problems to fix.

I had started a similar program myself, before you sent that.  My
biggest problem was (and partially still is) identifying the
relationship of tags in the .pto file to fields in pano13 data
structures and fields in Hugin data structures.  I used your
Hugin2PTOptimize.c as an extra source (on top of the .pto reading and
writing code in Hugin and pano13) for understanding the relationship
of fields to tags.
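
For concreteness, here is my current understanding of one such line
(take the tag letters and field names below as approximate, not as a
reference):

    # one image line in a .pto script ("o" in PTOptimizer input,
    # "i" in Hugin project files); the single letters are the tags:
    o f0 y12.5 p-3.2 r0.8 v50.0 a0.0 b-0.01 c0.0
    # y/p/r = yaw/pitch/roll, v = horizontal field of view,
    # a/b/c = radial distortion; each corresponds to a field in
    # pano13's Image struct and to a variable on Hugin's side.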

My version reads a .pto file (Hugin format or pano13 format),
combines the data from a previous run of optimize (if present) into
the image data, then optionally adds a specified amount of random
noise to the optimized values, and writes out a new .pto file.
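
The noise step itself is simple.  A minimal, self-contained sketch
(Img is a hypothetical stand-in for pano13's Image struct; the real
field names may differ):

    #include <random>
    #include <vector>

    // Hypothetical stand-in for pano13's Image struct.
    struct Img { double yaw, pitch, roll, hfov; };

    // Perturb the optimized orientation of every image by up to
    // +/- amount degrees of uniform noise, so that repeated optimizer
    // runs start from slightly different points.
    void addNoise(std::vector<Img>& imgs, double amount)
    {
        std::mt19937 gen{std::random_device{}()};
        std::uniform_real_distribution<double> dist(-amount, amount);
        for (Img& im : imgs) {
            im.yaw   += dist(gen);
            im.pitch += dist(gen);
            im.roll  += dist(gen);
        }
    }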

As a tool, this is intended for testing the benefits of changes to
optimize.  Optimize is an unstable enough process that insignificant
differences can cause massive differences in the result (likely
including the case that led me to start this thread).  So if you make
a small change that ought, on average, to cause a small improvement,
a test showing a giant degradation or a giant improvement is more
likely than an "average" result.  Ordinary testing therefore can't
tell you whether the coding change doesn't work as expected, the
example is flawed, or you are just seeing the basic instability of
optimize.

My theory and hope is that running each program version to be tested
against each optimize problem many times, with different random noise
added each time, will give results that show meaningful differences
between versions.
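
To make that concrete, here is the kind of harness logic I have in
mind, with hypothetical names (this comparison is not in the program
itself): collect the final error from many noisy trials of each build
and compare means rather than single runs.

    #include <cmath>
    #include <numeric>
    #include <vector>

    double mean(const std::vector<double>& v)
    {
        return std::accumulate(v.begin(), v.end(), 0.0) / v.size();
    }

    // Rough two-sample comparison: difference of mean errors in
    // units of the pooled standard error, so one wild trial can't
    // dominate the verdict the way a single test run does.
    double zScore(const std::vector<double>& a,
                  const std::vector<double>& b)
    {
        auto var = [](const std::vector<double>& v, double m) {
            double s = 0.0;
            for (double x : v) s += (x - m) * (x - m);
            return s / (v.size() - 1);
        };
        double ma = mean(a), mb = mean(b);
        double se = std::sqrt(var(a, ma) / a.size()
                              + var(b, mb) / b.size());
        return (ma - mb) / se;
    }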

As a coding exercise, I built this C++ program to
1) Figure out good methods for using headers and structs from the
pano13 C code, and ultimately for maintaining a compatible C
interface while rewriting a lot of the operations in C++.  There are
problems with those headers and structs in C++, but the solutions
aren't very difficult.
2) Find a better and more unified way to act generically across
fields.  This is done (or left undone) in many different ways across
Hugin and pano13, and I did not like any of them.  #include tricks
and heavy use of #define obfuscate that code and make debugging much
harder.  At the opposite extreme, using switch statements to split
out tags causes massive code bloat and the related maintenance
problems.  I created a better method with templated maps and very
minimal use of #define; see the sketch after this list.  It will need
a bit more touch-up (as I get a more complete understanding of the
whole system of tags and fields) before I use it in a C++ rewrite of
optimize and other parts of pano13.
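
To give a flavor of that method, here is a minimal sketch with
hypothetical names (the real tables are built from the pano13 structs
and cover more types than double).  A table maps each tag letter to a
pointer-to-member, so reading and writing any field becomes one
generic loop instead of a switch per tag or an #include/#define
trick.

    #include <map>
    #include <string>

    // Hypothetical stand-in for pano13's Image struct.
    struct Img { double yaw, pitch, roll, hfov; };

    // Tag letter -> pointer-to-member, one table per field type.
    template <typename S, typename T>
    using FieldMap = std::map<char, T S::*>;

    const FieldMap<Img, double> imgDoubles = {
        {'y', &Img::yaw}, {'p', &Img::pitch},
        {'r', &Img::roll}, {'v', &Img::hfov},
    };

    // Generic writer: emits the tagged fields of one image line (in
    // the map's key order, i.e. "p... r... v... y...").  A reader
    // walks the same table in the other direction.
    std::string writeFields(const Img& im)
    {
        std::string out;
        for (const auto& [tag, member] : imgDoubles) {
            out += tag;
            out += std::to_string(im.*member);
            out += ' ';
        }
        return out;
    }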

Like you, I put all the code in one file (except for the unmodified
filter.h and the other headers it pulls in).  Once the code is more
stable, it should be clear how to split it into multiple .cpp and
.hpp files.

If anyone wants that code, ask and I'll put it somewhere accessible.
I'm open to suggestions.  I put a ??? comment in the code wherever I
didn't know the right way to do something because I don't understand
the related parts of the pano13 package well enough.

