Wyatt,
I am now at a loss as to why the test tab stopped showing up on the .NET Core
tests - I have changed the script back to the state it was last working in,
changed the glob paths, but still no luck. I could really use some help with
this.
The strange thing is that it is still working fine on .NET Framework. Since
there was no way that the NUnit build feature could have worked with the paths
that were configured there, I have come to the conclusion that this has nothing
to do with the failure and that it probably shouldn't even be enabled. The
original location of the XML files, when you set this up, was in the root of
each project directory. It seemed to stop working around the time I changed it
to put them under /release/<framework-version>/<project name>/TestResult.xml. I
first tried adding a second location to output the XML, and then tried moving
it back to where it was originally, to no avail.
There is one major difference between the tests for the 2 frameworks - .NET
Core uses the dotnet.exe command line runner and .NET Framework uses the NUnit
command line runner. However, since that fact has not changed in the past
couple of days I am not sure why the tests are not showing up in TeamCity.
I suppose I could try targeting the last commit that was known to work to see
if it still works to determine if it has anything at all to do with the changes
I have made or if it is something that happened to the environment (like
installing a new version of .NET Core SDK or changing a TeamCity setting).
Anyway, I have migrated all of the functionality into the PSake script now. The
build is basically done except for the .NET Core tests not showing up and
adding the steps to pack and push.
Here is how I am thinking we need to set up the build:

              Lucene.Net Base Build (does nothing but produce the version)
                      /                                  \
  .NET 4.5.1 (version > compile > test)     DotNetCore 1.0 (version > compile > test)
                      \                                  /
              Lucene.Net Package (version > build > pack > push to MyGet)
Let me know if that works, or if we need to adjust it - right now the
dependencies are automatically enforced by PSake, so if you run Test it will
automatically version, compile, and test (and it works similarly if you run
Pack). We could alternatively do Pack in the first step and then push the
binaries through, but we would need to remove the dependent task from Test if
we did it that way.
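For reference, the dependency chaining works roughly like this in PSake (task names here are illustrative, not necessarily the exact ones in runbuild.ps1):

```powershell
# Illustrative PSake task graph; the real tasks in our script may differ.
Task Version {
    # Stamp the version numbers into the source
}

Task Compile -depends Version {
    # Build the solution for all target frameworks
}

Task Test -depends Compile {
    # Run the tests; PSake runs Version and Compile first automatically
}

Task Pack -depends Compile {
    # Create the NuGet packages
}
```

Running Invoke-psake .\runbuild.ps1 Test walks the chain Version > Compile > Test; dropping -depends from Test is what it would take if we did Pack in the first step instead.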
Thanks,
Shad Storhaug (NightOwl888)
-----Original Message-----
From: Shad Storhaug [mailto:[email protected]]
Sent: Wednesday, April 12, 2017 1:23 AM
To: Wyatt Barnett
Cc: Connie Yau; [email protected]
Subject: RE: API Work/Stabilization Update
It looks like there was yet another issue causing all of the tests to run -
according to the TeamCity docs, a "Configuration Parameter" is not passed into
the build. It isn't clear what the purpose of these parameters is if you can't
use them in the build, but I have swapped them for environment variables -
hopefully that fixes the issue of running all of the tests instead of just the
target framework.
Also, I noticed the glob pattern wasn't quite right for the XML files and have
updated it.
I have queued another attempt to see if this resolves the issues.
-----Original Message-----
From: Shad Storhaug [mailto:[email protected]]
Sent: Tuesday, April 11, 2017 11:45 PM
To: Wyatt Barnett
Cc: Connie Yau; [email protected]
Subject: RE: API Work/Stabilization Update
Wyatt,
I am familiar with TeamCity’s approach to versioning – and it isn’t right (or
at least isn’t thorough).
1. AssemblyVersion – this should only increment on a major version. So,
this should always be set to 4.0.0.0 in our case. This is how you get around
the issues with strong naming (at least that is how Microsoft did it on MVC). I
know that Itamar doesn’t want to strong name, but by using versioning the same
way as if we were strong-named, we won’t be backed into a corner if it turns
out that people demand it later. If we increment this to 4.8.0.XXX now, it will
be impossible to set it back to 4.0.0.0 later. This number is not visible on
the outside of the assembly, and its number only really matters if the assembly
is strong-named.
2. FileVersion – this is where we increment the version on the assembly –
it is visible from the properties on the outside of the assembly, but it
doesn’t support a pre-release tag.
3. InformationalVersion – this accepts any old string, so we put our
entire version number here, including any pre-release tag (since the other 2
attributes don’t support pre-release tags). Also visible on the outside of the
assembly.
Last time I tried the patch feature in TeamCity, it didn’t update the versions
correctly (though I don’t recall exactly what it did).
We have an additional hiccup here – when building from the CLI, there doesn’t
appear to be a way to read the InformationalVersion in .NET Core. I made an
attempt to make a custom attribute to do that, but it didn’t work either. Now I
am thinking maybe hard-coding LUCENE_VERSION to a string and having the build
script update the Constants.cs file using a RegEx as well – not only will that
be more reliable across runtime environments, it doesn’t use Reflection at
runtime so it will be faster, too.
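A sketch of what that RegEx update could look like (the file path and the exact shape of the LUCENE_VERSION declaration are assumptions on my part):

```powershell
# Hypothetical sketch: stamp the version string into Constants.cs at build time.
$version = "4.8.0.1000-beta2"
$file = "src\Lucene.Net\Util\Constants.cs"  # assumed location

$content = Get-Content $file -Raw
# Replace whatever string literal is currently assigned to LUCENE_VERSION.
$content = [regex]::Replace(
    $content,
    '(LUCENE_VERSION\s*=\s*")[^"]*(")',
    "`${1}$version`${2}")
Set-Content $file $content -NoNewline
```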
Rethinking the versioning. There is one limitation about using a single
incrementing number in TeamCity that is a bit troublesome. Assuming from this
point forward we “upgrade” to the next Java version rather than port from
scratch again, we would typically want to reset the counter to 0. But if all
TeamCity has is an incrementing number that would mean it would no longer have
unique version numbers. So, we would be making build 1 all over again when we
go from 4.8.0.1 to 4.8.1.1, for example. The whole point of setting up the numbers for
versioning is to make them unique across all application builds – but we would
have collisions within TeamCity if we use a single incrementing number rather
than putting the whole version string in TeamCity (unless we never reset the
number to 0, but then we have a scenario like a Y2K bug that will eventually
blow up in the distant future when we run out of numbers). Not to mention, it
would be much clearer looking at the TeamCity log if the whole version number
is there. Thoughts?
I finally got the .NET Core build from the command line not to crash on the
Thai tests. For some reason, it seems to be completely ignoring the conditional
compilation symbol on Lucene.Net.Tests.Analysis.Common. So, I ended up manually
failing 2 of the tests in both .NET Core and .NET Framework and it appears to
have done the trick. It looks like the test fails nearly 100% of the time from
the command line, so hopefully this can be narrowed down to a non-random test
that I can submit to icu-dotnet, which will get the ball rolling on a fix.
I just noticed that there was another issue (that I initiated) by using
“parameters” instead of “properties” in the PowerShell script on TeamCity,
which I have now resolved. Using the former caused the tests to run for both
frameworks (because the script uses PSake properties, not PowerShell
parameters, so it wasn’t overwriting the default value) – so we should have our
first visible .NET Core test run in TeamCity in another hour.
As for the test results files, it makes more sense to separate them if run from
the command line manually. I think the problems we are having on the server
were mostly due to the properties thing. I have moved them back to the original
location (in the release\TestResults\ folder), so now your TestResults glob
pattern should pick them up (either with or without the framework in the
string, since each test is on a clean path anyway).
Thanks for putting this together – now it is much clearer what you had in mind.
Now, if you could work on getting some of the automation in place so each phase
automatically triggers when the last phase ends, I can work on the script and
fixing remaining build problems. BTW – would it be better to merge api-work to
master sooner or later? Is there anything we need to shut down first?
Thanks,
Shad Storhaug (NightOwl888)
From: Wyatt Barnett [mailto:[email protected]]
Sent: Tuesday, April 11, 2017 9:58 PM
To: Shad Storhaug
Cc: Connie Yau; [email protected]
Subject: Re: API Work/Stabilization Update
Hi Shad -- Just checking back in here -- looks like you are starting to walk it
over the target. A few notes from what I can see here:
* Looking at the testing stuff, I think it was behaving correctly before the
changes to make multiple copies of the xml files based on version; walking that
back might make sense.
* For the assembly version number -- just leave the AssemblyInfo.cs files;
TeamCity can replace that file version on the fly.
Let me know if you need further assistance.
On Mon, Apr 10, 2017 at 8:48 PM Wyatt Barnett
<[email protected]> wrote:
I just cleaned up the builds a lot -- there are now folders effectively. Pay
attention to the builds under
https://teamcity.jetbrains.com/project.html?projectId=LuceneNet_PortableBuilds&tab=projectOverview
-- that is what I'm thinking. There is a base "build" project that does a
basic build to make sure we didn't check in something fundamentally broken and
sets the build number. Following that, 0-n test projects can pick that up and
run it. Finally, we can add a package project depending on the test project to
build packages and finally an upload project to upload at the end of the day
presuming all is successful.
I am waiting on the build server to work out the whole "no compatible agents"
thing to see if I got things setup correctly -- I suspect this will work as
this is just breaking up your build steps with a little glue.
Overall I think the build script should have the bulk of the smarts. Build
servers do best when left as fancy yet dumb orchestration and notification
automatons rather than doing a lot of the thinking. I agree the version should
be kept in the repo to a large extent. One approach to versioning would be to
set it up as a global variable in the ps script -- something like
$CURRENT_VERSION_PATTERN="4.8.0.{0}" and an $IS_BETA flag, with a
GetCurrentVersion() function that works out the math for internal consumption.
The external API (parameters) could take an optional build number parameter and
an optional is-beta flag, and we could handle the version string like that.
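Sketched out, that might look something like this (the -beta suffix handling is my guess at the intent):

```powershell
# Sketch of the versioning approach described above.
$CURRENT_VERSION_PATTERN = "4.8.0.{0}"
$IS_BETA = $true

function GetCurrentVersion([int]$buildNumber) {
    # TeamCity supplies only the build number; the script works out the rest.
    $version = $CURRENT_VERSION_PATTERN -f $buildNumber
    if ($IS_BETA) { $version += "-beta" }
    return $version  # e.g. GetCurrentVersion 1000 -> "4.8.0.1000-beta"
}
```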
I'll check back in a bit to see if the builds fired off and we can take it from
there.
On Mon, Apr 10, 2017 at 4:38 PM Shad Storhaug
<[email protected]> wrote:
> Having me take a stab at it sounds good, when would you be expecting
> turnaround?
ASAP. I want to try to get this beta out as quickly as we can. I really need to
wrap this up and find some paid work! But if you aren’t available today, I will
give it a shot – I was just hoping to take a little of this off my plate. If
you do have time for this today/tonight (wherever you are), let me know – there
are plenty of other tasks to work on.
I have the script pretty much all transferred to PSake – it cut the amount of
code almost in half. It looks like there already is a NuGet Publish task built
into TeamCity, so I don’t see much point to keeping it in the script.
Ok, I will change the parameters of the script to utilize the build number and
still allow the entire string to be passed in another parameter. It seems like
control of the versioning is ultimately something that should be in the
repository where people without TeamCity access can modify it. The only
downside is that the “version” displayed in TeamCity will only be the
incrementing number instead of what is on the NuGet packages.
From: Wyatt Barnett
[mailto:[email protected]]
Sent: Tuesday, April 11, 2017 2:46 AM
To: Shad Storhaug
Cc: Connie Yau; [email protected]
Subject: Re: API Work/Stabilization Update
Sorry for the confusion. Having me take a stab at it sounds good, when would
you be expecting turnaround?
Totally agree on one true, consistent version stream. I would hit it slightly
differently though -- I'd let TC supply the build number and then calculate the
rest of the string and not depend on parsing in the script. It is a bit easier
and cleaner to me but that might just be me.
On Mon, Apr 10, 2017 at 3:13 PM Shad Storhaug
<[email protected]> wrote:
Ok, now I am confused again ☺. But since you said the scripts are separated
enough now to set it up, could you set up the initial framework of the builds
and dependencies so they all work from the same commit? It should be pretty
straightforward to figure out what the expectations are and build the script
around them after that step is done.
It would also help if I understood how you want to setup the versioning. I
can’t imagine a simpler way to do it than having the build server specify the
entire version string and having the script work out how to chop it up and put
it where it needs to, but maybe there is a better way that I haven’t considered.
We certainly want to keep the same version scheme between MyGet and NuGet so we
aren’t locked into “we can’t fix this release because we can’t change the
version” like what happened in the past. It really doesn’t make much difference
if we promote to NuGet from TeamCity or from MyGet (although if we use MyGet we
don’t have to configure anything extra to do it).
From: Wyatt Barnett
[mailto:[email protected]]
Sent: Tuesday, April 11, 2017 1:39 AM
To: Shad Storhaug
Cc: Connie Yau; [email protected]
Subject: Re: API Work/Stabilization Update
Sorry -- I knew I left something off. Yes -- we would have a final "package"
step that depends on all the other steps in the chain passing for it to
complete and push to myget. Note we can pass artifacts down the chain so the
preceding steps could do most of the build work for the final one to push to
myget. Sorry if I didn't make that clear.
On Mon, Apr 10, 2017 at 2:34 PM Shad Storhaug
<[email protected]> wrote:
Wyatt,
That helps.
But one thing to note is that the end product is a single NuGet package with
both .NET 451 and .NET Core 1.0 DLLs in it. That kind of breaks the idea of
having 2 different builds – unless we have a 3rd build that somehow depends on
the other two that pushes the NuGet package out to MyGet. Or set one of the
builds up to fail the other if it fails. Any ideas?
Thanks,
Shad Storhaug (NightOwl888)
From: Wyatt Barnett
[mailto:[email protected]]
Sent: Tuesday, April 11, 2017 1:26 AM
To: Shad Storhaug
Cc: Connie Yau; [email protected]
Subject: Re: API Work/Stabilization Update
Hi Shad -- yeah it looks like a lot of progress. Still under a rock at the day
job.
To answer a question from the previous note -- that "no compatible agents are
available" seems to be a side-effect of how they are using on-demand cloud
build agents. I think it really translates to "nobody is online who is capable
of doing that but we are spinning up a new one, hang tight."
My thought was to have a few separate, chained builds so we would do distinct
build/test runs against each framework. To get there the build script needs to
understand "build and test for net451" and "build and test for netcore1.0"
distinctly. It looks like the components are now there -- you are running those
separately, just in the same build configuration. Because they are separate
builds we should be OK on xml result files overwriting. And they probably can
be parallelized on multiple agents. Does that help demystify things a bit?
On Mon, Apr 10, 2017 at 1:28 PM Shad Storhaug
<[email protected]> wrote:
Wyatt,
I have managed to get the test failures under control and started working on a
PSake script, the first task being "test". I have also set it up so NUnit will
output TeamCity service messages, and made separate TeamCity tasks for each
framework.
However, instrumentation really hasn't improved much from when it all ran in
one task (at least the last test I ran without the service messages). So, I am
not sure exactly what you had in mind to make it easy to spot which framework
caused the build failure. If you want to make some edits to the build
configuration to show me what you had in mind (or just explain your ideas) that
would be great.
One thing I could do (if it helps) is to give the XML files different names per
framework or put them into different folders per framework. Right now we just
have one framework overwriting the XML files from the last one.
Thanks,
Shad Storhaug (NightOwl888)
-----Original Message-----
From: Shad Storhaug [mailto:[email protected]]
Sent: Saturday, April 8, 2017 11:18 PM
To: Wyatt Barnett
Cc: Connie Yau; [email protected]
Subject: RE: API Work/Stabilization Update
I finally got a build to run all the way through on TeamCity a couple of days
ago, and now I am trying to fix tests that are failing on the command line that
didn't fail in Visual Studio. A lot of them were failing because the embedded
resources didn't end up in the same place when using the CLI. So I added a set
of extension methods that do an aggressive match by trying various combinations
of assembly name/partial namespace, partial namespace only, and partial
assembly name only. It seems Microsoft still hasn't worked out all of the bugs
with how embedded resources behave, and this is the only way to "fix it once".
Anyway, I am trying to start another build to see how many tests are left to
fix, but it seems that there are no longer any connected compatible agents that
can run it. Wyatt, can you check into this please?
Thanks,
Shad Storhaug (NightOwl888)
-----Original Message-----
From: Shad Storhaug [mailto:[email protected]]
Sent: Friday, April 7, 2017 3:06 AM
To: Wyatt Barnett
Cc: Connie Yau; [email protected]
Subject: RE: API Work/Stabilization Update
Well, PSake runs on top of Powershell – it just makes Powershell act like
MSBuild in that you can have multiple tasks that can be run individually and
can be dependent on other tasks. But since we got most of the process setup in
PowerShell already it would be simpler to transfer the existing functionality
if we stick with PowerShell.
FYI – I modified the README page already
(https://github.com/apache/lucenenet/tree/api-work). If anyone else has
something to add or wants to make a change I will accept requests/pull requests.
I think we should add a paragraph similar to this one:
https://github.com/snikch/jquery.dirtyforms/blob/master/CONTRIBUTING.md#code-contributions
to our contributions page (especially those linked articles, they are a good
read). And of course the status there needs updating.
I ended up manually failing all of the ThaiAnalyzer tests in .NET Core, since
there is no way to catch an AccessViolationException in .NET Core. That seems
to have fixed the stability problems with the test runner (I need to do some
more testing to be sure). But, tomorrow I will start testing with TeamCity to
see if I can get the tests to run all the way through.
From: Wyatt Barnett
[mailto:[email protected]]
Sent: Friday, April 7, 2017 1:30 AM
To: Shad Storhaug
Cc: Connie Yau; [email protected]
Subject: Re: API Work/Stabilization Update
Good question Shad. I think I de-escaped things properly -- the powershell
command it is running is ./runbuild.ps1 -RunTests -FrameworksToTest @("net451",
"netcoreapp1.0")
I wholeheartedly agree that this build is complicated enough that we should get
something a little beefier than powershell in place. I was looking at FAKE
before but PSake could work too. I can't claim to have significant seat-time
with either but that is why one does these open source projects.
Anyhow more coming later this week/weekend when I can find some time to write
things up . . .
On Thu, Apr 6, 2017 at 11:37 AM Shad Storhaug
<[email protected]>
wrote:
Thanks.
Hmm…Did you play with the escaping of quotes? What I pasted was the raw command
to run from CMD, but that might not be what will run in TeamCity.
I am not entirely certain why it sometimes prompts for that (even when you
don’t specify to run a push), or why there are even required parameters. These
are some of the reasons why I am suggesting that PSake might be better, but
since I don’t know for certain if you can exit the script and re-enter the same
target without having to execute all of the target’s dependencies all over
again, I would like to run a few tests in TeamCity. Worst case, we don’t
specify any dependent targets for “test” so we can run it multiple times
without triggering the restore and build processes over and over (which is
probably harmless, but takes up lots of time). But it helps if you have
separate targets you can call in the script without having to set lots of
different Boolean flags to tell it what to run and what not to.
From: Wyatt Barnett
[mailto:[email protected]]
Sent: Thursday, April 6, 2017 6:20 PM
To: Shad Storhaug
Cc: Connie Yau;
[email protected]
Subject: Re: API Work/Stabilization Update
Hi Shad -- you now have teamcity permissions. Please be a bit careful in
editing things -- it is a house of cards without great documentation as to why
things are doing what and there isn't an easy rollback button like with source
controlled code. Let me know if you want a tour of what is what and we can find
a time to do some screen sharing.
That command won't run for me on the server -- the error is:
[14:14:38]W: [Step 1/2] C:\BuildAgent\work\dc5a565ec74d072b\runbuild.ps1 :
Cannot process command because of one or more missing mandatory
[14:14:38]W: [Step 1/2] parameters: NuGetApiKey UploadPackages.
[14:14:38]W: [Step 1/2] + CategoryInfo : InvalidArgument: (:)
[runbuild.ps1], ParentContainsErrorRecordException
[14:14:38]W: [Step 1/2] + FullyQualifiedErrorId :
MissingMandatoryParameter,runbuild.ps1
Is there a way to run it without trying to upload the packages? That is a
fairly mechanical thing we can certainly make work once we get build, test and
packaging doing the right things.
I will need a bit more time than I can find on weekdays to respond to your
other notes -- lots of good thoughts in there, they need cogent responses.
On Wed, Apr 5, 2017 at 6:19 PM Shad Storhaug
<[email protected]>
wrote:
Wyatt,
It turns out that adding the 4.5.1 framework to project.json was all that was
required to get the tests to run, but dotnet.exe seems to crash whenever an
exception is thrown from a test on that framework (saw 3 different exception
types - NullReferenceException, AccessViolationException,
IndexOutOfRangeException and they all brought it to a halt). So, I gave the
NUnit build runner a shot and it seems to be more stable (and much faster as
well). I wasn't able to get it working on .NET Core, but I haven't yet seen it
crash when running the .NET Core tests.
Give it a shot to see if this will run all the way through without crashing in
TeamCity.
powershell -ExecutionPolicy Bypass -Command "& .\runbuild.ps1 -PackageVersion
4.8.0.1000-beta2 -RunTests -FrameworksToTest @(\"net451\", \"netcoreapp1.0\")"
Then we can work on getting this setup so tests can be run as separate steps.
Thanks,
Shad Storhaug (NightOwl888)
-----Original Message-----
From: Shad Storhaug
[mailto:[email protected]]
Sent: Wednesday, April 5, 2017 8:42 PM
To: Wyatt Barnett
Cc: Connie Yau;
[email protected]
Subject: RE: API Work/Stabilization Update
I found one issue that is preventing the tests from running - the runner isn't
running .NET 4.5.1 framework tests at all. That framework is missing from
project.json in the test projects altogether, so it seems this has never been
run. I tried putting that section in, but I am getting an error that the
"dotnet-test-nunit" target is missing on .NET 4.5.1 (and tracking down the
cause of this doesn't seem to be plausible amongst all of the advice telling to
"upgrade"). The build works, so I don't think it would be wise to upgrade just
for the sake of testing.
So, it seems we are back to square 1 - you are onto something with installing
the NUnit console runner for this. That is also easier said than done - for
some reason both Visual Studio and dotnet restore are ignoring the fact that I
added the NuGet package to the project and are not downloading the binaries. I
guess worst case scenario I can just check the binaries into the repo, but it
would be more difficult to upgrade if we did that, so I am trying to find a fix
to the NuGet download issue first.
The good news is that in .NET Core I was able to successfully run all tests in
the Lucene.Net.Tests project.
As for splitting this up, I am thinking that starting a script with a true
build runner like PSake is probably a good idea for the long term. Perhaps we
could just take the test part out of the current script so we could set it up
in TeamCity like:
1. Restore/Build step using runbuild.ps1
2. Test (using a different script that can be called multiple times)
3. Package/Push using runbuild.ps1 (side-effect of going through the Restore/Build steps all over again)
Also, allow the raw "where" clause to be passed directly to the command line
tool rather than doing all of the building that is done currently. Then the
tests could be setup in different steps however you like. To ensure we can add
projects to the setup without having them ignored by default, I suggest:
1. Test Lucene.Net (50% of the tests)
2. Test Lucene.Net.Analysis.Common (25% of the tests)
3. Test everything but Lucene.Net and Lucene.Net.Analysis.Common (25% of the tests)
And repeat for each framework.
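For example (a sketch; the -Where parameter is hypothetical, the filters use NUnit 3's test selection language, and the exact expressions would need tuning to match our namespaces):

```powershell
# Hypothetical raw "where" pass-through for the three test steps above.
.\runbuild.ps1 -RunTests -Where "namespace =~ 'Lucene.Net.Tests'"
.\runbuild.ps1 -RunTests -Where "namespace =~ 'Lucene.Net.Tests.Analysis.Common'"
.\runbuild.ps1 -RunTests -Where "namespace !~ 'Lucene.Net.Tests' and namespace !~ 'Analysis.Common'"
```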
Or...wait a minute. I just noticed that one of the packages that downloads with
NUnit.Console is a TeamCity event listener. Would it make more sense just to
skip a test script for this (at least for .NET 4.5.1) and set it up using
TeamCity tools?
Thanks,
Shad Storhaug (NightOwl888)
-----Original Message-----
From: Shad Storhaug
[mailto:[email protected]]
Sent: Wednesday, April 5, 2017 1:10 PM
To: Wyatt Barnett
Cc: Connie Yau;
[email protected]
Subject: RE: API Work/Stabilization Update
Wyatt,
I have been getting that randomly, too (but I thought it was mainly due to
Visual Studio locking the file). Re-running has always made it recover here.
Basically, I have set it up so it backs up the project.json files before making
edits so it can restore them at the end. This is primarily to prevent someone
running locally from accidentally checking in the changes the build makes.
However, if it cannot backup the files for some reason (a permission error
perhaps) it won’t be able to do this step.
So, looks like we need to add an option to override the backup/restore file
process on the build server, which will work around any windows permission
issues that prevent that from working. I will also add a check if the file
exists before attempting to restore to make it more robust.
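Something like this is what I have in mind (the switch name is hypothetical):

```powershell
# Hypothetical -NoBackup option to work around file permission issues on the server.
param([switch]$NoBackup)

function Backup-ProjectJson([string]$path) {
    if ($NoBackup) { return }
    Copy-Item $path "$path.bak" -Force
}

function Restore-ProjectJson([string]$path) {
    if ($NoBackup) { return }
    # Only restore if the backup actually exists, to make the step more robust.
    if (Test-Path "$path.bak") {
        Move-Item "$path.bak" $path -Force
    }
}
```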
The modified process does one very important step – it names the main NuGet
package Lucene.Net instead of Lucene.Net.Core. I tried making the script
compensate for the fact that dotnet.exe doesn’t support an option for changing
the package name to be different than the folder name, but that became very
complicated – it was simpler just to rename the folder from Lucene.Net.Core to
Lucene.Net.
As per your suggestion, we should probably break up the restore, build, test,
and pack steps so they work separately. Unfortunately, this script wasn’t built
with a task runner with separate entry points, so that is going to be a bit
complicated. Do we need the part of the script that pushes to NuGet/MyGet, or
is that something you can setup on TeamCity as a separate step without a script?
What specific handling do you need on the version string? I set it up so you
can pass the PackageVersion (of the NuGet package) separate from the Version
(of the assembly). It also sets the PackageVersion as the
AssemblyInformationalVersion of the assembly and also takes care of setting all
of the versions for inter-package dependencies between NuGet packages.
Basically, this makes it simple to specify only 1 parameter for version and the
script will take care of the rest.
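In other words, the script can derive everything from one parameter, roughly like this (variable names are approximate, not the script's actual ones):

```powershell
# Derive the assembly versions from a single PackageVersion parameter.
$PackageVersion = "4.8.0.1000-beta2"

# Strip any pre-release tag to get the numeric version for FileVersion.
$Version = ($PackageVersion -split '-')[0]   # "4.8.0.1000"
# AssemblyInformationalVersion keeps the full string, pre-release tag and all.
$InformationalVersion = $PackageVersion
# AssemblyVersion stays pinned to the major version only.
$AssemblyVersion = "4.0.0.0"
```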
Keep in mind, however we set it up that this process is temporary. We will
likely change it up again when Microsoft finally gets .csproj working right.
So, we should focus on the *interface* of the build process (that is, how
TeamCity will interact with the script) so we can change the script around
later without breaking the build. Perhaps we should use a task runner like
PSake so we have separate entry points: https://github.com/psake/psake? I have
always used it in conjunction with PowerShell in the past (although I have to
admit, I never needed any of the other entry points – but then, I never had 2
hours of testing to do). That would eliminate the need to specify a lot of
different switches to turn on and off the right stuff for the current task.
E.g., Invoke-psake .\runbuild.ps1 Test -framework 4.0x64 -properties @{
"framework"="netcore1.0" } would just run the tests for .NET Core and do
nothing else (see the Spatial4n script:
https://github.com/NightOwl888/Spatial4n/blob/master/.build/build.ps1).
Build.bat is primarily to make the syntax of building simpler than what is
required to call powershell when building locally (build -pv:4.8.0.891-beta2
rather than powershell -Command "& .\runbuild.ps1 -CreatePackages
-PackageVersion 4.8.0.891-beta2"). It could be expanded with more parameters
and/or environment variables if necessary – I find it simpler to maintain and
test if there is both a way to manually build and automatically build that run
through the same script (the manual part should be as simple as possible, the
automated part only as complex as it needs to be for the build server to do its
magic).
I tried running all of the tests last night from the command line and it
crashed during the run. Fortunately, I have a stack trace that might help track
it down.
Culture: vai-Latn-LR
Time Zone: (UTC-05:00) Bogota, Lima, Quito, Rio Branco Default Codec: Lucene41
(Lucene.Net.Codecs.Lucene41.Lucene41RWCodec)
Default Similarity: RandomSimilarityProvider(queryNorm=False,coord=yes):
=> Lucene.Net.Search.TestConstantScoreQuery.TestQueryWrapperFilter
Passed
Culture: vai-Latn-LR
Time Zone: (UTC-05:00) Bogota, Lima, Quito, Rio Branco Default Codec: Lucene41
(Lucene.Net.Codecs.Lucene41.Lucene41RWCodec)
Default Similarity: RandomSimilarityProvider(queryNorm=False,coord=yes):
[field, DFR G3(800)] =>
Lucene.Net.Search.TestConstantScoreQuery.TestWrapped2Times
Passed
Culture: he-IL
Time Zone: (UTC-04:00) Asuncion
Default Codec: Lucene3x (Lucene.Net.Codecs.Lucene3x.PreFlexRWCodec)
Default Similarity: DefaultSimilarity
: hit exc
System.NullReferenceException: Object reference not set to an instance of an
object.
at Lucene.Net.Util.IOUtils.ReThrow(Exception th) in
F:\Projects\lucenenet\src\Lucene.Net\Util\IOUtils.cs:line 413
at Lucene.Net.Index.StoredFieldsProcessor.Flush(SegmentWriteState state) in
F:\Projects\lucenenet\src\Lucene.Net\Index\StoredFieldsProcessor.cs:line 91
at Lucene.Net.Index.TwoStoredFieldsConsumers.Flush(SegmentWriteState state)
in F:\Projects\lucenenet\src\Lucene.Net\Index\TwoStoredFieldsConsumers.cs:line
47
at Lucene.Net.Index.DocFieldProcessor.Flush(SegmentWriteState state) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocFieldProcessor.cs:line 84
at Lucene.Net.Index.DocumentsWriterPerThread.Flush() in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriterPerThread.cs:line 573
at Lucene.Net.Index.DocumentsWriter.DoFlush(DocumentsWriterPerThread
flushingDWPT) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 638
at Lucene.Net.Index.DocumentsWriter.PreUpdate() in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 461
at Lucene.Net.Index.DocumentsWriter.UpdateDocuments(IEnumerable`1 docs,
Analyzer analyzer, Term delTerm) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 513
at Lucene.Net.Index.IndexWriter.UpdateDocuments(Term delTerm, IEnumerable`1
docs, Analyzer analyzer) in
F:\Projects\lucenenet\src\Lucene.Net\Index\IndexWriter.cs:line 1562
at Lucene.Net.Index.TrackingIndexWriter.UpdateDocuments(Term t,
IEnumerable`1 docs) in
F:\Projects\lucenenet\src\Lucene.Net\Index\TrackingIndexWriter.cs:line 103
at Lucene.Net.Search.TestControlledRealTimeReopenThread.UpdateDocuments(Term
id, IEnumerable`1 docs) in
F:\Projects\lucenenet\src\Lucene.Net.Tests\Search\TestControlledRealTimeReopenThread.cs:line
116
at
Lucene.Net.Index.ThreadedIndexingAndSearchingTestCase.ThreadAnonymousInnerClassHelper.Run()
in
F:\Projects\lucenenet\src\Lucene.Net.TestFramework\Index\ThreadedIndexingAndSearchingTestCase.cs:line
276
at Lucene.Net.Util.IOUtils.ReThrow(Exception th) in
F:\Projects\lucenenet\src\Lucene.Net\Util\IOUtils.cs:line 413
at Lucene.Net.Index.StoredFieldsProcessor.Flush(SegmentWriteState state) in
F:\Projects\lucenenet\src\Lucene.Net\Index\StoredFieldsProcessor.cs:line 91
at Lucene.Net.Index.TwoStoredFieldsConsumers.Flush(SegmentWriteState state)
in F:\Projects\lucenenet\src\Lucene.Net\Index\TwoStoredFieldsConsumers.cs:line
47
at Lucene.Net.Index.DocFieldProcessor.Flush(SegmentWriteState state) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocFieldProcessor.cs:line 84
index=_0(4.2):C10/1:delGen=1 _1(4.2):C10 _2(4.2):C10 _3(4.2):C5 _4(4.2):C1IW
8 [5/4/2017 01:48:05; ]: merging _1(4.2):C10 _2(4.2):C10 _0(4.2):C10/1:delGen=1
_3(4.2):C5 _4(4.2):C1IW 8 [5/4/2017 01:48:05; ]: seg=_1(4.2):C10 no deletesIW 8
[5/4/2017 01:48:05; ]: seg=_2(4.2):C10 no deletesIW 8 [5/4/2017 01:48:05; ]:
seg=_0(4.2):C10/1:delGen=1 delCount=1IW 8 [5/4/2017 01:48:05; ]: seg=_3(4.2):C5
no deletesIW 8 [5/4/2017 01:48:05; ]: seg=_4(4.2):C1 no deletesSM 8 [5/4/2017
01:48:05; ]: merge store matchedCount=5 vs 5SM 8 [5/4/2017 01:48:05; ]: 0 msec
to merge stored fields [35 docs]SM 8 [5/4/2017 01:48:05; ]: 3 msec to merge
postings [35 docs]SM 8 [5/4/2017 01:48:05; ]: 5 msec to merge doc values [35
docs]SM 8 [5/4/2017 01:48:05; ]: 0 msec to merge norms [35 docs]SM 8 [5/4/2017
01:48:05; ]: 1 msec to merge vectors [35 docs]IW 8 [5/4/2017 01:48:05; ]: merge
codec=Lucene46: [[trieLong, PostingsFormat(name=Memory doPackFST= False)],
[content2, PostingsFormat(name=Memory doPackFST= True)], [id,
PostingsFormat(name=Memory doPackFST= True)], [content5, FST41], [autf8,
PostingsFormat(name=Direct)], [trieInt, PostingsFormat(name=Memory doPackFST=
False)], [utf8, PostingsFormat(name=Memory doPackFST= False)], [fieΓ▒╖ld,
FST41], [content3, PostingsFormat(name=Direct)], [content,
PostingsFormat(name=Memory doPackFST= True)], [content6,
PostingsFormat(name=Direct)]], docValues:[[dvSortedSet,
DocValuesFormat(name=Disk)], [dvPacked, DocValuesFormat(name=Disk)],
[dvBytesStraightVar, DocValuesFormat(name=Asserting)], [dvLong,
DocValuesFormat(name=Disk)], [dvInt, DocValuesFormat(name=Memory)],
[dvBytesDerefFixed, DocValuesFormat(name=Memory)], [dvFloat,
DocValuesFormat(name=Asserting)], [dvBytesSortedFixed,
DocValuesFormat(name=SimpleText)], [dvBytesStraightFixed,
DocValuesFormat(name=Memory)], [dvBytesSortedVar, DocValuesFormat(name=Disk)],
[dvDouble, DocValuesFormat(name=Disk)], [dvBytesDerefVar,
DocValuesFormat(name=Memory)], [dvByte, DocValuesFormat(name=Disk)], [dvShort,
DocValuesFormat(name=Disk)]] docCount=35; merged segment has vectors; norms;
docValues; prox; freqsIW 8 [5/4/2017 01:48:05; ]: merged segment size=%.3f MB
vs estimate=%.3f MBTP 8 [5/4/2017 01:48:05; ]: startCommitMergeIW 8 [5/4/2017
01:48:05; ]: commitMerge: _1(4.2):C10 _2(4.2):C10 _0(4.2):C10/1:delGen=1
_3(4.2):C5 _4(4.2):C1 index=_0(4.2):C10/1:delGen=1 _1(4.2):C10 _2(4.2):C10
_3(4.2):C5 _4(4.2):C1TP 8 [5/4/2017 01:48:05; ]: startCommitMergeDeletesIW 8
[5/4/2017 01:48:05; ]: commitMergeDeletes _1(4.2):C10 _2(4.2):C10
_0(4.2):C10/1:delGen=1 _3(4.2):C5 _4(4.2):C1IW 8 [5/4/2017 01:48:05; ]: no new
deletes or field updates since merge startedIFD 8 [5/4/2017 01:48:05; ]: now
checkpoint "_5(4.8):C35" [1 segments ; isCommit = False]IFD 8 [5/4/2017
01:48:05; ]: 0 msec to checkpointIW 8 [5/4/2017 01:48:05; ]: after commitMerge:
_5(4.8):C35UPGMP 8 [5/4/2017 01:48:05; ]: findForcedMerges:
segmentsToUpgrade=System.Collections.Generic.Dictionary`2[Lucene.Net.Index.SegmentCommitInfo,System.Nullable`1[System.Boolean]]IW
8 [5/4/2017 01:48:05; ]: merge time 0 msec for 35 docsIndexUpgrader 8
[5/4/2017 01:48:05; ]: All segments upgraded to version 4.8CMS 8 [5/4/2017
01:48:05; ]: merge thread: doneIW 8 [5/4/2017 01:48:05; ]: now flush at close
waitForMerges=TrueTP 8 [5/4/2017 01:48:05; ]: startDoFlushIW 8 [5/4/2017
01:48:05; ]: start flush: applyAllDeletes=TrueIW 8 [5/4/2017 01:48:05; ]:
index before flush _5(4.8):C35DW 8 [5/4/2017 01:48:05; ]: startFullFlushDW 8
[5/4/2017 01:48:05; ]: anyChanges? numDocsInRam=0 deletes=False
hasTickets:False pendingChangesInFullFlush: FalseDW 8 [5/4/2017 01:48:05; ]:
finishFullFlush success=TrueIW 8 [5/4/2017 01:48:05; ]: apply all deletes
during flushBD 8 [5/4/2017 01:48:05; ]: applyDeletes: no deletes; skippingBD 8
[5/4/2017 01:48:05; ]: prune sis=Lucene.Net.Index.SegmentInfos minGen=0
packetCount=0CMS 8 [5/4/2017 01:48:05; ]: now mergeCMS 8 [5/4/2017 01:48:05; ]:
index: _5(4.8):C35CMS 8 [5/4/2017 01:48:05; ]: no more merges pending; now
returnIW 8 [5/4/2017 01:48:05; ]: waitForMergesIW 8 [5/4/2017 01:48:05; ]:
waitForMerges doneIW 8 [5/4/2017 01:48:05; ]: now call final commit()IW 8
[5/4/2017 01:48:05; ]: commit: startIW 8 [5/4/2017 01:48:05; ]: commit: enter
lockIW 8 [5/4/2017 01:48:05; ]: commit: now prepareIW 8 [5/4/2017 01:48:05; ]:
prepareCommit: flushIW 8 [5/4/2017 01:48:05; ]: index before flush
_5(4.8):C35TP 8 [5/4/2017 01:48:05; ]: startDoFlushDW 8 [5/4/2017 01:48:05; ]:
startFullFlushDW 8 [5/4/2017 01:48:05; ]: anyChanges? numDocsInRam=0
deletes=False hasTickets:False pendingChangesInFullFlush: FalseIW 8 [5/4/2017
01:48:05; ]: apply all deletes during flushBD 8 [5/4/2017 01:48:05; ]:
applyDeletes: no deletes; skippingBD 8 [5/4/2017 01:48:05; ]: prune
sis=Lucene.Net.Index.SegmentInfos minGen=0 packetCount=0DW 8 [5/4/2017
01:48:05; ]: finishFullFlush success=TrueTP 8 [5/4/2017 01:48:05; ]:
startStartCommitIW 8 [5/4/2017 01:48:05; ]: StartCommit(): startIW 8 [5/4/2017
01:48:05; ]: startCommit index=_5(4.8):C35 changeCount=2TP 8 [5/4/2017
01:48:05; ]: midStartCommitTP 8 [5/4/2017 01:48:05; ]: midStartCommit2IW 8
[5/4/2017 01:48:05; ]: done all syncs: _5.fdx, _5.fdt, _5_Direct_0.doc,
_5_Direct_0.pos, _5_Direct_0.pay, _5_Direct_0.tim, _5_Direct_0.tip,
_5_Memory_0.ram, _5_FST41_0.doc, _5_FST41_0.pos, _5_FST41_0.pay,
_5_FST41_0.tmp, _5_Memory_1.ram, _5_Disk_0.dvdd, _5_Disk_0.dvdm,
_5_Memory_0.mdvd, _5_Memory_0.mdvm, _5_SimpleText_0.dat, _5_Asserting_0.dvd,
_5_Asserting_0.dvm, _5.nvd, _5.nvm, _5.tvx, _5.tvd, _5.fnm, _5.siTP 8 [5/4/2017
01:48:05; ]: midStartCommitSuccessTP 8 [5/4/2017 01:48:05; ]:
finishStartCommitIW 8 [5/4/2017 01:48:05; ]: commit: pendingCommit != nullIW 8
[5/4/2017 01:48:05; ]: commit: wrote segments file "segments_4"IFD 8 [5/4/2017
01:48:05; ]: now checkpoint "_5(4.8):C35" [1 segments ; isCommit = True]IFD 8
[5/4/2017 01:48:05; ]: deleteCommits: now decRef commit "segments_3"IFD 8
[5/4/2017 01:48:05; ]: delete "segments_3"IFD 8 [5/4/2017 01:48:05; ]: delete
"_0.fnm"IFD 8 [5/4/2017 01:48:05; ]: delete "_0_Lucene42_0.dvd"IFD 8 [5/4/2017
01:48:05; ]: delete "_0_Lucene41_0.pos"IFD 8 [5/4/2017 01:48:05; ]: delete
"_0.tvd"IFD 8 [5/4/2017 01:48:05; ]: delete "_0_Lucene42_0.dvm"IFD 8 [5/4/2017
01:48:05; ]: delete "_0.nvm"IFD 8 [5/4/2017 01:48:05; ]: delete
"_0_Lucene41_0.pay"IFD 8 [5/4/2017 01:48:05; ]: delete "_0.tvx"IFD 8 [5/4/2017
01:48:05; ]: delete "_0_Lucene41_0.doc"IFD 8 [5/4/2017 01:48:05; ]: delete
"_0.nvd"IFD 8 [5/4/2017 01:48:05; ]: delete "_0.fdx"IFD 8 [5/4/2017 01:48:05;
]: delete "_0.si"IFD 8 [5/4/2017 01:48:05; ]: delete
"_0_Lucene41_0.tim"IFD 8 [5/4/2017 01:48:05; ]: delete "_0.fdt"IFD 8 [5/4/2017
01:48:05; ]: delete "_0_Lucene41_0.tip"IFD 8 [5/4/2017 01:48:05; ]: delete
"_0_1.del"IFD 8 [5/4/2017 01:48:05; ]: delete "_1.tvx"IFD 8 [5/4/2017 01:48:05;
]: delete "_1.nvm"IFD 8 [5/4/2017 01:48:05; ]: delete "_1_Lucene41_0.doc"IFD 8
[5/4/2017 01:48:05; ]: delete "_1_Lucene42_0.dvm"IFD 8 [5/4/2017 01:48:05; ]:
delete "_1_Lucene41_0.tim"IFD 8 [5/4/2017 01:48:05; ]: delete "_1.nvd"IFD 8
[5/4/2017 01:48:05; ]: delete "_1_Lucene41_0.tip"IFD 8 [5/4/2017 01:48:05; ]:
delete "_1_Lucene42_0.dvd"IFD 8 [5/4/2017 01:48:05; ]: delete "_1.fnm"IFD 8
[5/4/2017 01:48:05; ]: delete "_1_Lucene41_0.pos"IFD 8 [5/4/2017 01:48:05; ]:
delete "_1.fdx"IFD 8 [5/4/2017 01:48:05; ]: delete "_1_Lucene41_0.pay"IFD 8
[5/4/2017 01:48:05; ]: delete "_1.fdt"IFD 8 [5/4/2017 01:48:05; ]: delete
"_1.si"IFD 8 [5/4/2017 01:48:05; ]: delete
"_1.tvd"IFD 8 [5/4/2017 01:48:05; ]: delete
"_2.si"IFD 8 [5/4/2017 01:48:05; ]: delete
"_2_Lucene41_0.pos"IFD 8 [5/4/2017 01:48:05; ]: delete "_2_Lucene41_0.tim"IFD 8
[5/4/2017 01:48:05; ]: delete "_2.fdt"IFD 8 [5/4/2017 01:48:05; ]: delete
"_2_Lucene41_0.doc"IFD 8 [5/4/2017 01:48:05; ]: delete "_2_Lucene41_0.tip"IFD 8
[5/4/2017 01:48:05; ]: delete "_2.fdx"IFD 8 [5/4/2017 01:48:05; ]: delete
"_2.tvx"IFD 8 [5/4/2017 01:48:05; ]: delete "_2.fnm"IFD 8 [5/4/2017 01:48:05;
]: delete "_2.tvd"IFD 8 [5/4/2017 01:48:05; ]: delete "_2.nvm"IFD 8 [5/4/2017
01:48:05; ]: delete "_2_Lucene42_0.dvm"IFD 8 [5/4/2017 01:48:05; ]: delete
"_2_Lucene41_0.pay"IFD 8 [5/4/2017 01:48:05; ]: delete "_2.nvd"IFD 8 [5/4/2017
01:48:05; ]: delete "_2_Lucene42_0.dvd"IFD 8 [5/4/2017 01:48:05; ]: delete
"_3.tvd"IFD 8 [5/4/2017 01:48:05; ]: delete "_3_Lucene41_0.pay"IFD 8 [5/4/2017
01:48:05; ]: delete "_3.fdt"IFD 8 [5/4/2017 01:48:05; ]: delete "_3.fnm"IFD 8
[5/4/2017 01:48:05; ]: delete "_3.fdx"IFD 8 [5/4/2017 01:48:05; ]: delete
"_3.tvx"IFD 8 [5/4/2017 01:48:05; ]: delete "_3.nvd"IFD 8 [5/4/2017 01:48:05;
]: delete "_3_Lucene41_0.pos"IFD 8 [5/4/2017 01:48:05; ]: delete
"_3_Lucene41_0.doc"IFD 8 [5/4/2017 01:48:05; ]: delete "_3_Lucene41_0.tip"IFD 8
[5/4/2017 01:48:05; ]: delete "_3_Lucene42_0.dvm"IFD 8 [5/4/2017 01:48:05; ]:
delete "_3.si"IFD 8 [5/4/2017 01:48:05; ]: delete
"_3_Lucene41_0.tim"IFD 8 [5/4/2017 01:48:05; ]: delete "_3.nvm"IFD 8 [5/4/2017
01:48:05; ]: delete "_3_Lucene42_0.dvd"IFD 8 [5/4/2017 01:48:05; ]: delete
"_4.fdx"IFD 8 [5/4/2017 01:48:05; ]: delete "_4.nvd"IFD 8 [5/4/2017 01:48:05;
]: delete "_4_Lucene41_0.doc"IFD 8 [5/4/2017 01:48:05; ]: delete "_4.fnm"IFD 8
01:48:05; ]: delete "_4.si"IFD 8 [5/4/2017
01:48:05; ]: delete "_4.fdt"IFD 8 [5/4/2017 01:48:05; ]: delete
"_4_Lucene41_0.tip"IFD 8 [5/4/2017 01:48:05; ]: delete "_4.nvm"IFD 8 [5/4/2017
01:48:05; ]: delete "_4_Lucene41_0.tim"IFD 8 [5/4/2017 01:48:05; ]: 0 msec to
checkpointIW 8 [5/4/2017 01:48:05; ]: commit: doneIW 8 [5/4/2017 01:48:05; ]:
at close: _5(4.8):C35
at Lucene.Net.Index.DocumentsWriterPerThread.Flush() in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriterPerThread.cs:line 573
at Lucene.Net.Index.DocumentsWriter.DoFlush(DocumentsWriterPerThread
flushingDWPT) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 638
at Lucene.Net.Index.DocumentsWriter.PreUpdate() in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 461
at Lucene.Net.Index.DocumentsWriter.UpdateDocuments(IEnumerable`1 docs,
Analyzer analyzer, Term delTerm) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 513
at Lucene.Net.Index.IndexWriter.UpdateDocuments(Term delTerm, IEnumerable`1
docs, Analyzer analyzer) in
F:\Projects\lucenenet\src\Lucene.Net\Index\IndexWriter.cs:line 1562
at Lucene.Net.Index.TrackingIndexWriter.UpdateDocuments(Term t,
IEnumerable`1 docs) in
F:\Projects\lucenenet\src\Lucene.Net\Index\TrackingIndexWriter.cs:line 103
at Lucene.Net.Search.TestControlledRealTimeReopenThread.UpdateDocuments(Term
id, IEnumerable`1 docs) in
F:\Projects\lucenenet\src\Lucene.Net.Tests\Search\TestControlledRealTimeReopenThread.cs:line
116 Unhandled Exception: System.Exception: System.NullReferenceException:
Object reference not set to an instance of an object.
at Lucene.Net.Util.IOUtils.ReThrow(Exception th) in
F:\Projects\lucenenet\src\Lucene.Net\Util\IOUtils.cs:line 413
at Lucene.Net.Index.StoredFieldsProcessor.Flush(SegmentWriteState state) in
F:\Projects\lucenenet\src\Lucene.Net\Index\StoredFieldsProcessor.cs:line 91
at Lucene.Net.Index.TwoStoredFieldsConsumers.Flush(SegmentWriteState state)
in F:\Projects\lucenenet\src\Lucene.Net\Index\TwoStoredFieldsConsumers.cs:line
47
at Lucene.Net.Index.DocFieldProcessor.Flush(SegmentWriteState state) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocFieldProcessor.cs:line 84
at Lucene.Net.Index.DocumentsWriterPerThread.Flush() in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriterPerThread.cs:line 573
at Lucene.Net.Index.DocumentsWriter.DoFlush(DocumentsWriterPerThread
flushingDWPT) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 638
at Lucene.Net.Index.DocumentsWriter.PreUpdate() in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 461
at Lucene.Net.Index.DocumentsWriter.UpdateDocuments(IEnumerable`1 docs,
Analyzer analyzer, Term delTerm) in
F:\Projects\lucenenet\src\Lucene.Net\Index\DocumentsWriter.cs:line 513
at Lucene.Net.Index.IndexWriter.UpdateDocuments(Term delTerm, IEnumerable`1
docs, Analyzer analyzer) in
F:\Projects\lucenenet\src\Lucene.Net\Index\IndexWriter.cs:line 1562
at Lucene.Net.Index.TrackingIndexWriter.UpdateDocuments(Term t,
IEnumerable`1 docs) in
F:\Projects\lucenenet\src\Lucene.Net\Index\TrackingIndexWriter.cs:line 103
at Lucene.Net.Search.TestControlledRealTimeReopenThread.UpdateDocuments(Term
id, IEnumerable`1 docs) in
F:\Projects\lucenenet\src\Lucene.Net.Tests\Search\TestControlledRealTimeReopenThread.cs:line
116
at
Lucene.Net.Index.ThreadedIndexingAndSearchingTestCase.ThreadAnonymousInnerClassHelper.Run()
in
F:\Projects\lucenenet\src\Lucene.Net.TestFramework\Index\ThreadedIndexingAndSearchingTestCase.cs:line
276 ---> System.NullReferenceException: Object reference not set to an
instance of an object.
at Lucene.Net.Util.IOUtils.ReThrow(Exception th)
at Lucene.Net.Index.StoredFieldsProcessor.Flush(SegmentWriteState state)
at Lucene.Net.Index.TwoStoredFieldsConsumers.Flush(SegmentWriteState state)
at Lucene.Net.Index.DocFieldProcessor.Flush(SegmentWriteState state)
at Lucene.Net.Index.DocumentsWriterPerThread.Flush()
at Lucene.Net.Index.DocumentsWriter.DoFlush(DocumentsWriterPerThread
flushingDWPT)
at Lucene.Net.Index.DocumentsWriter.PreUpdate()
at Lucene.Net.Index.DocumentsWriter.UpdateDocuments(IEnumerable`1 docs,
Analyzer analyzer, Term delTerm)
at Lucene.Net.Index.IndexWriter.UpdateDocuments(Term delTerm, IEnumerable`1
docs, Analyzer analyzer)
at Lucene.Net.Index.TrackingIndexWriter.UpdateDocuments(Term t,
IEnumerable`1 docs)
at Lucene.Net.Search.TestControlledRealTimeReopenThread.UpdateDocuments(Term
id, IEnumerable`1 docs)
at
Lucene.Net.Index.ThreadedIndexingAndSearchingTestCase.ThreadAnonymousInnerClassHelper.Run()
--- End of inner exception stack trace ---
at
Lucene.Net.Index.ThreadedIndexingAndSearchingTestCase.ThreadAnonymousInnerClassHelper.Run()
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext,
ContextCallback callback, Object state)
at
Lucene.Net.Index.ThreadedIndexingAndSearchingTestCase.ThreadAnonymousInnerClassHelper.Run()
in
F:\Projects\lucenenet\src\Lucene.Net.TestFramework\Index\ThreadedIndexingAndSearchingTestCase.cs:line
276
WARNING: Could not find TestResult.xml.
Testing [Lucene.Net.Tests.Analysis.Common] on [net451]...
dotnet.exe test --configuration Release --framework net451 --no-build Project
'F:\Projects\lucenenet\src\Lucene.Net.Tests.Analysis.Common\project.json' does
not support framework: net451
WARNING: Could not find TestResult.xml.
Testing [Lucene.Net.Tests.Analysis.Common] on [netcoreapp1.0]...
dotnet.exe test --configuration Release --framework netcoreapp1.0 --no-build
NUnit .NET Core Runner 3.4.0 Copyright (C) 2016 Charlie Poole Runtime
Environment
OS Platform: Windows
OS Version: 10.0.14393
Runtime: win10-x64
Test Files
F:\Projects\lucenenet\src\Lucene.Net.Tests.Analysis.Common\bin\Release\netcoreapp1.0\Lucene.Net.Tests.Analysis.Common.dll
Terminate batch job (Y/N)? y
And yea, I see that it is having trouble finding TestResult.xml files, too.
Perhaps that move step also needs to be made optional or taken out entirely.
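If we do keep a move step, a guarded version along these lines might help; the paths and variable names here are placeholders, not the script's actual layout:

```powershell
# Hypothetical sketch: only move the NUnit result file if it exists, so a
# missing TestResult.xml produces a warning instead of derailing the build.
$testDirectory   = "src\SomeTestProject\bin\Release"  # placeholder path
$outputDirectory = "release\TestResults"              # placeholder path
$resultFile = Join-Path $testDirectory "TestResult.xml"
if (Test-Path $resultFile) {
    Move-Item $resultFile $outputDirectory -Force
} else {
    Write-Warning "No TestResult.xml found in $testDirectory; skipping move."
}
```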
Thanks,
Shad Storhaug (NightOwl888)
From: Wyatt Barnett
[mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>]
Sent: Wednesday, April 5, 2017 4:43 AM
To: Shad Storhaug
Cc: Connie Yau;
[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>
Subject: Re: API Work/Stabilization Update
Update -- build was building and testing as of yesterday evening, it looks like
some of the build.ps1 changes blew up whatever was working there, the build is
now failing because it can't find some project.json.bak file . . .
On Tue, Apr 4, 2017 at 4:41 PM Wyatt Barnett
<[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>>
wrote:
Hi Shad -- sorry I have been trying to get the whole thing working on TeamCity
for the last few days. Unfortunately test runs take upwards of an hour and I
have to remember to get back to it to check so progress is slow. In any case
I'm actually very close to getting the fundamentals worked out -- see
https://teamcity.jetbrains.com/viewType.html?buildTypeId=LuceneNet_Vs2015LuceneNetPortable
for the specific build project. At this point I can get it to build and run
all the tests. The problem is reading the test results -- build.ps1 makes the
NUnit XML files, but it moves them, and for some reason TeamCity didn't like
that. Is there any reason it was taking that copy step, Connie?
Project.json vs Project.csproj is pretty immaterial to me.
Anyhow, it does appear the build server has the pre-requisites required to do
so. As for teamcity access I would create an account at
https://teamcity.jetbrains.com and then let me know what the user / email is
and I'll see if I can get them to patch you in.
FWIW build.bat probably doesn't help much -- we need a bit more elegant
handling of the version string, and probably want to run separate tests for
each target so we know what is failing.
On Tue, Apr 4, 2017 at 1:31 PM Shad Storhaug
<[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>>
wrote:
Update
I have made it past the hurdles with the build, and got the build working on
MyGet. But since MyGet has a build timeout of 15 minutes, there is no way to
run the tests there.
So, now we need to get it working on TeamCity. AFAIK I don't have access to it,
so if someone could either help me out or provide access that would be great.
project.json files are NOT the new way (as I previously thought) - Microsoft
has already scrapped this idea. I guess back when Connie was working on it
things were still up in the air:
http://fizzylogic.nl/2016/05/11/project-json-is-going-away-and-it-s-a-good-thing/
- and actually they still are. So, we might have to keep it this way for now
and change it around at some later point, once Microsoft officially announces
that multi-targeting support is released; then we can go back to a single
solution and a single project file per project.
We need to have the .NET Core SDK installed on the build server. Although it is
possible to download the binaries, they are more than 100MB when unpacked. So,
here is the list of prerequisites for the build server:
PowerShell version 3.0 or higher
Git for Windows
.NET Core 1.1 with SDK Preview 2.1 build 3177
(https://github.com/dotnet/core/blob/master/release-notes/download-archive.md)
To run the build, execute "build.bat".
Environment Variables:
%PackageVersion% - Entire version string including pre-release tag
%RunAllTests% - Pass "true" to bypass the 4 parameters that you would otherwise
have to set on runbuild.ps1 in order to run all of the tests for both .NET
Framework and .NET Core
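To sketch how those variables might be consumed (the -RunTests switch name here is an assumption, not necessarily what runbuild.ps1 actually exposes):

```powershell
# Hypothetical sketch of how the build could pick up the TeamCity-supplied
# environment variables and hand them to the shared PowerShell script.
$packageVersion = $env:PackageVersion           # e.g. "4.8.0.891-beta2"
$runAllTests    = ($env:RunAllTests -eq "true") # full matrix on the server
& .\runbuild.ps1 -PackageVersion $packageVersion -RunTests:$runAllTests
```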
Thanks,
Shad Storhaug (NightOwl888)
-----Original Message-----
From: Shad Storhaug
[mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>]
Sent: Sunday, April 2, 2017 4:10 PM
To:
[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>
Cc:
[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>;
Connie Yau
Subject: RE: API Work/Stabilization Update
Update
I tried running the tests with verbosity on, and NUnit is misbehaving in ways I
didn't previously realize. There is apparently some resource leak that causes a
buildup over several tests; once it reaches a certain threshold, all of the
tests fail with an OutOfMemoryException. For now, I am going to consider fixing
verbosity low priority - instead we should disable it for the time being and
focus on the build/test runs.
Looking through the build script, there are a few things of note:
1. It skips long-running tests and tests that are marked with a timeout by
default.
2. It runs in Release by default.
3. Connie has listed Visual Studio 2015 Update 3 as a prerequisite.
Clearly, this isn't what we want. We shouldn't be skipping any tests on the
build server except for those that have been manually ignored. Verbosity is
ignored in the build script by default, which explains why the verbosity
issue is not causing the script to crash.
I am also trying to work out if there is a way to run without Visual Studio.
The script depends on dotnet.exe, which (I think) is installed by the .NET Core
SDK. I am attempting to find out if we really need the SDK or if the executable
is all we need. Either way, I don't see this leaving the ground without having
.NET Framework 4.5.1 and (possibly) .NET Core 1.1 installed on the server.
I guess this leaves a few questions:
1. Is it safe to assume the .NET Framework 4.5.1 and .NET Core 1.1 runtimes
will be installed on the server?
2. Can we install the .NET Core SDK on the server, or is our only option to
find out if the tools we need are portable (I can't imagine they are not, but
you never know)?
Thanks,
Shad Storhaug (NightOwl888)
-----Original Message-----
From: Wyatt Barnett
[mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>]
Sent: Saturday, April 1, 2017 7:52 AM
To:
[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>
Subject: Re: API Work/Stabilization Update
I wanted to walk back to one thing you asked about the tests Shad:
"Worst case, we can just turn off verbosity on the build, but it might be
helpful to leave it on if something goes wrong"
I *think* I'm turning the verbosity off in the tests -- they should be running
in release mode. They would not even run in debug if I recall correctly. Let me
know how the switch works and I'll make sure I'm throwing it on the build
server side.
On Fri, Mar 31, 2017 at 6:56 PM Shad Storhaug
<[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>>
wrote:
> I don't necessarily need ownership access, but he did give me
> ownership to the https://www.nuget.org/packages/Spatial4n.Core/ package.
> Alternatively, Itamar can enter his keys on
> https://www.myget.org/feed/Security/spatial4n - he already has ownership.
> Once the keys are added there, I will be able to push.
>
> -----Original Message-----
> From: Prescott Nasser
> [mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>]
> Sent: Saturday, April 1, 2017 5:49 AM
> To:
> [email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>
> Subject: RE: API Work/Stabilization Update
>
> Access like ownership access? Paging Itamar..
>
>
>
> -----Original Message-----
> From: Shad Storhaug
> [mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>]
> Sent: Friday, March 31, 2017 3:38 PM
> To:
> [email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>
> Subject: RE: API Work/Stabilization Update
>
> BTW - the contrib NuGet package is now dead - the functionality has
> been moved into new sub-projects just like in Lucene.
>
> But, that reminds me - I still don't have access to
> https://www.nuget.org/packages/Spatial4n.Core.NTS/, so I have been
> unable to update it to the same version that
> https://www.nuget.org/packages/Spatial4n.Core/ is on. NTS is not
> required by Lucene.Net, but it is required for anyone who needs to run the
> tests.
>
> -----Original Message-----
> From: Shad Storhaug
> [mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>]
> Sent: Saturday, April 1, 2017 5:31 AM
> To:
> [email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>
> Subject: RE: API Work/Stabilization Update
>
> Wyatt,
>
> Great. Yea, actually after I sent my last email I was starting to
> wonder if Connie ever tested the build on a box without Visual Studio
> installed.
> We might need to determine what the prerequisites for the build are so
> we know what to put on the build server.
>
> I haven't yet had a chance to determine what tests are causing the
> build to crash (although, I know I can reliably reproduce it). I plan
> on getting that straightened out shortly. Worst case, we can just turn
> off verbosity on the build, but it might be helpful to leave it on if
> something goes wrong. I'll also look into the build dependencies and the
> script itself.
>
> I am just going through now and deleting all of the old 3.x source
> files that are now just causing noise in the project and resurrecting
> the tests for the support classes. But I am nearly finished. If you
> are available, I would like to try to get the beta2 release done over
> the weekend (at least to the point where it is on MyGet). We need to
> get the README and CONTRIBUTING files updated with the latest status
> (it would be helpful to sync the wiki page as well:
> https://cwiki.apache.org/confluence/display/LUCENENET/Current+Status -
> I don't think I have access for that).
>
> Several issues regarding the main readme need to be addressed:
>
> 1. There is no link to the wiki
> 2. There is no link to the issue tracker
> 3. It is not clear what the status is
> 4. There are no build instructions
> 5. There is no mention that we need help to finish
> 6. There is no link to the license
> 7. The documentation links are out of date
> 8. The top part of the repo already lists the files; do we really need to
> describe them again?
>
> Itamar mentioned he would be working on a new web site. I am not sure
> what the status of that is, but either way these issues are causing
> friction for anyone who is willing to help but can't navigate these
> hurdles. People who are used to working with GitHub, its issue tracker,
> and wiki don't find it natural to go looking somewhere else for these
> tools. For beta testing, we definitely need to make it crystal clear
> how to report bugs.
>
> Also, the "known issues" list is getting short enough that I can start
> adding the items to the issue tracker.
>
> I have been using the downtime while running tests to fix up the
> documentation comments in Lucene.Net.Core, but it's still only about
> 50% done. I could use some help getting them fixed so they are at
> least visible in Visual Studio intellisense. See the latest status on
> #203. And then of course we can use them to generate new documents.
>
>
> Thanks,
> Shad Storhaug (NightOwl888)
>
>
> -----Original Message-----
> From: Wyatt Barnett
> [mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>]
> Sent: Saturday, April 1, 2017 4:22 AM
> To:
> [email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>
> Subject: Re: API Work/Stabilization Update
>
> Shad -- definitely makes sense.
>
> Json files are fine -- functionally this is a bit too fancy to use
> Teamcity's automagic nuget package generation so as long as we've got
> a file to edit we are fine.
>
> Myget -> nuget works for me but that doesn't solve the key problem. I
> don't have it, maybe Prescott or Itamar know where it is kept but I
> can't claim to have ever seen it. I joined this party after the last nuget
> push.
>
> It is a bit foggy but I think I ran into the nunit-console issue with
> the
> Build.ps1 script. Remember that with build servers the pre-requisites
> often need to be embedded in the project for things to work properly.
>
> Anyhow, let me know when you are in a good place with your branch to
> start slogging through getting the new build working. In the interests
> of full disclosure I'm working an event the last week and a half of
> April and will be completely out of pocket then. But I'm about otherwise.
>
> On Sat, Mar 25, 2017 at 10:04 AM Shad Storhaug
> <[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>>
> wrote:
>
> > Wyatt,
> >
> > > We will probably want to build out .nuspec files to get all the
> > > nuget stuff right for these projects -- I don't think the generation
> > > will work for us to get things quite right.
> >
> > Connie has set us up to use .json files instead of .nuspec files to
> > generate the NuGet packages (the new way instead of the old way).
> > The build script Build.ps1 does it all (it even has help
> > documentation), but it is missing an option to override versioning.
> > Ideally we would be able to override the version that is in the
> > .json file with an environment variable (which you can pass from
> > TeamCity), and be able to override that on the CLI for local
> > "one-off" builds. See the build
> instructions on #191:
> > https://github.com/apache/lucenenet/pull/191
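A rough sketch of the override precedence described above (a CLI argument beats an environment variable, which beats the version in the .json file). This is purely illustrative: `resolve_version`, `PACKAGE_VERSION`, and the `4.8.0` default are invented names, and the real logic would live in Build.ps1.

```shell
# Illustrative only: sketches the version-override precedence discussed above.
# resolve_version, PACKAGE_VERSION, and the default value are invented names;
# the real implementation would live in Build.ps1.
resolve_version() {
    # $1              : optional one-off override passed on the CLI
    # PACKAGE_VERSION : optional override set as an environment variable
    #                   (e.g. by a TeamCity build parameter)
    default_version='4.8.0'   # stand-in for the version read from the .json file
    echo "${1:-${PACKAGE_VERSION:-$default_version}}"
}
```

So a local "one-off" build could pass the version directly on the command line, while TeamCity would just set the environment variable and invoke the script with no argument.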
> >
> > > Regarding the nuget key -- that plan works for me, the trick is I
> > > don't have the key to add to myget.
> >
> > I don't know what order the infrastructure was set up in on your end,
> > but my thought was that if someone had previously pushed from MyGet
> > to NuGet the key is probably already configured there. But yea, you
> > would need access to MyGet to confirm.
> >
> > > I would love to start beating on that a bit but the .net core
> > > version seems to want NUnit 3.5+ which needs to be added to the
> > > project to run.
> >
> > I will take a look at your pull request, but I think this is a
> > symptom of trying to run using the older tooling. The Build.ps1
> > script already has the ability to test, and all of the tooling is
> > there to do it (I think - maybe I should do a fresh clone to be
> > sure). It does have some prerequisites, though (see #191). It builds
> > both the .NET Framework and .NET Core versions and packages them into NuGet.
> >
> > Per #191: Hopefully Lucene.Net.sln can be removed in the future
> > because the .NET Core projects compile for .NET 4.5.1 already.
> >
> > So I think the aim is to eventually eliminate those .csproj files
> > (and for that matter .nuspec files) and use strictly .json files for
> > project configuration going forward.
> >
> >
> > Thanks,
> > Shad Storhaug (NightOwl888)
> >
> > -----Original Message-----
> > From: Wyatt Barnett
> > [mailto:[email protected]]
> > Sent: Saturday, March 25, 2017 5:00 AM
> > To: [email protected]
> > Subject: Re: API Work/Stabilization Update
> >
> > Shad -- the overall plan sounds good. We will probably want to build
> > out .nuspec files to get all the nuget stuff right for these
> > projects
> > -- I don't think the generation will work for us to get things quite right.
> >
> > Regarding the nuget key -- that plan works for me, the trick is I
> > don't have the key to add to myget. Come to think of it I don't
> > think I have the proverbial keys to the myget page either but I
> > think Martin can help us out there.
> >
> > Buffers could be the issue on the tests -- I've long suspected that
> > or I/O causing the meltdown, I just haven't been able to reproduce.
> > I would love to start beating on that a bit but the .net core
> > version seems to want NUnit 3.5+ which needs to be added to the
> > project to run. If you get that added I can start beating on the
> > test problems a bit more.
> >
> > Thanks for all your hard work putting this together, let me know how
> > I can help you get it out the proverbial door.
> >
> > On Fri, Mar 24, 2017 at 9:34 AM Shad Storhaug
> > <[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>><mailto:[email protected]<mailto:[email protected]><mailto:[email protected]<mailto:[email protected]>>>>
> > wrote:
> >
> > > Wyatt,
> > >
> > > Thanks. Actually, I was thinking this should go in a few steps
> > > instead of
> > > one:
> > >
> > > 1. Merge #203.
> > > 2. Change the pre-release label to "beta2" and work out any issues
> > > to build/push to MyGet (might take a few tries).
> > > 3. Update the README and CONTRIBUTING pages.
> > > 4. Push the package to NuGet.
> > >
> > > I have always just used the control panel at MyGet to push
> > > upstream to NuGet, and it is capable of storing someone's key so
> > > the person who pushes it doesn't actually need it.
> > >
> > > As far as the tests burning down are concerned, I discovered that
> > > some of them write so much "verbose" data that they overflow
> > > NUnit's buffer and cause it to crash (sometimes this even causes
> > > Visual Studio to crash locally). I think I have found all of the
> > > tests in the Core that were causing this and hard-coded them to
> > > set verbose off (with the ability to manually override), but I
> > > noticed that there are still tests in Analysis.Common that can cause it
> > > to crash.
> > > I haven't investigated if there is a setting in NUnit to increase
> > > the buffer size, which might be a better fix, but I could