Hi NuPIC,
I'm trying to run a swarm on my Vagrant box with a pip-installed NuPIC. My
MySQL server requires a password, so I changed the password NuPIC uses in
the config file at this location:
/home/vagrant/.local/lib/python2.7/site-packages/nupic-0.3.4-py2.7.egg/nupic/support/nupic-default.xml
The connection to the MySQL server itself is fine.
Right now I am getting the following error:
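For context, the relevant block in nupic-default.xml looks roughly like this
(the property name is from memory and may differ slightly in 0.3.4, so check
your copy of the file):

```xml
<!-- Sketch of the database password property in nupic-default.xml.
     Only the <value> was changed, from the empty default to the
     actual MySQL password. -->
<property>
  <name>nupic.cluster.database.passwd</name>
  <value>my-mysql-password</value>
</property>
```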
vagrant@vagrant-ubuntu-trusty-64:~/resources/nupic/scripts$ python
run_swarm.py ../examples/swarm/simple/search_def.json --maxWorkers=8
--overwrite
Generating experiment files in directory:
/home/vagrant/resources/nupic/examples/swarm/simple...
Writing 313 lines...
Writing 114 lines...
done.
None
Successfully submitted new HyperSearch job, jobID=1016
Evaluated 0 models
HyperSearch finished!
Worker completion message: None
Results from all experiments:
----------------------------------------------------------------
Generating experiment files in directory: /tmp/tmpYqUxss...
Writing 313 lines...
Writing 114 lines...
done.
None
json.loads(jobInfo.results) raised an exception. Here is some info to help
with debugging:
jobInfo: _jobInfoNamedTuple(jobId=1016, client=u'GRP', clientInfo=u'',
clientKey=u'', cmdLine=u'$HYPERSEARCH', params=u'{"hsVersion": "v2",
"maxModels": null, "persistentJobGUID":
"206bb66a-9da6-11e5-8fb8-080027480f3d", "useTerminators": false,
"description": {"includedFields": [{"fieldName": "timestamp", "fieldType":
"datetime"}, {"fieldName": "consumption", "fieldType": "float"}],
"streamDef": {"info": "test", "version": 1, "streams": [{"info":
"hotGym.csv", "source": "file://extra/hotgym/hotgym.csv", "columns": ["*"],
"last_record": 100}], "aggregation": {"seconds": 0, "fields":
[["consumption", "sum"], ["gym", "first"], ["timestamp", "first"]],
"months": 0, "days": 0, "years": 0, "hours": 1, "microseconds": 0, "weeks":
0, "minutes": 0, "milliseconds": 0}}, "inferenceType": "MultiStep",
"inferenceArgs": {"predictionSteps": [1], "predictedField": "consumption"},
"iterationCount": -1, "swarmSize": "medium"}}', jobHash="
lY0\x9d\xa6\x11\xe5\x8f\xb8\x08\x00'H\x0f=", status=u'notStarted',
completionReason=None, completionMsg=None,
workerCompletionReason=u'success', workerCompletionMsg=None, cancel=0,
startTime=None, endTime=None, results=None, engJobType=u'hypersearch',
minimumWorkers=1, maximumWorkers=8, priority=0, engAllocateNewWorkers=1,
engUntendedDeadWorkers=0, numFailedWorkers=0,
lastFailedWorkerErrorMsg=None, engCleaningStatus=u'notdone',
genBaseDescription=None, genPermutations=None,
engLastUpdateTime=datetime.datetime(2015, 12, 8, 12, 20, 53),
engCjmConnId=None, engWorkerState=None, engStatus=None,
engModelMilestones=None)
jobInfo.results: None
EXCEPTION: expected string or buffer
Traceback (most recent call last):
File "run_swarm.py", line 187, in <module>
runPermutations(sys.argv[1:])
File "run_swarm.py", line 178, in runPermutations
fileArgPath, optionsDict, outputLabel, permWorkDir)
File
"/home/vagrant/.local/lib/python2.7/site-packages/nupic-0.3.4-py2.7.egg/nupic/swarming/permutations_runner.py",
line 310, in runWithJsonFile
verbosity=verbosity)
File
"/home/vagrant/.local/lib/python2.7/site-packages/nupic-0.3.4-py2.7.egg/nupic/swarming/permutations_runner.py",
line 277, in runWithConfig
return _runAction(runOptions)
File
"/home/vagrant/.local/lib/python2.7/site-packages/nupic-0.3.4-py2.7.egg/nupic/swarming/permutations_runner.py",
line 218, in _runAction
returnValue = _runHyperSearch(runOptions)
File
"/home/vagrant/.local/lib/python2.7/site-packages/nupic-0.3.4-py2.7.egg/nupic/swarming/permutations_runner.py",
line 161, in _runHyperSearch
metricsKeys=search.getDiscoveredMetricsKeys())
File
"/home/vagrant/.local/lib/python2.7/site-packages/nupic-0.3.4-py2.7.egg/nupic/swarming/permutations_runner.py",
line 826, in generateReport
results = json.loads(jobInfo.results)
File
"/home/vagrant/.local/lib/python2.7/site-packages/nupic-0.3.4-py2.7.egg/nupic/swarming/object_json.py",
line 163, in loads
json.loads(s, object_hook=objectDecoderHook, **kwargs))
File "/usr/lib/python2.7/json/__init__.py", line 351, in loads
return cls(encoding=encoding, **kw).decode(s)
File "/usr/lib/python2.7/json/decoder.py", line 366, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
TypeError: expected string or buffer
I have looked through the issues listed on GitHub, and from what I can tell
this was supposed to have been fixed when the dependency on the NUPIC
environment variable was removed.
I suspect the output above contains the crucial information about the error,
but I don't know how to interpret it. A number of fields are set to None
(notably status=u'notStarted' and results=None), but I don't know how they
relate to each other. Any help would be appreciated.
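As far as I can tell, the immediate TypeError is mechanical: jobInfo.results
is None, and Python 2's json.loads only accepts a string, so generateReport
blows up before anything useful is printed. A minimal standalone sketch (not
NuPIC code; parse_job_results is a hypothetical helper I made up to
illustrate the failure mode):

```python
import json

def parse_job_results(results):
    # jobInfo.results is None when the job never ran any models
    # ("Evaluated 0 models", status=u'notStarted' above). Calling
    # json.loads(None) raises TypeError: expected string or buffer,
    # which matches the traceback. Guarding on None avoids the crash
    # but the underlying question remains why no models were evaluated.
    if results is None:
        return None
    return json.loads(results)

print(parse_job_results(None))        # None, no TypeError
print(parse_job_results('{"a": 1}'))  # {'a': 1}
```

So the TypeError looks like a symptom; the real question seems to be why the
HyperSearch job stayed in notStarted with zero models evaluated.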
With regards,
Casper Rooker
[email protected]