Hi Santi,

Yes, this means that the extractor is working. (The curl errors in your
previous message were a shell problem, not a Riak one: the `@` in
`--data-binary @object.json` tells curl to read the request body from a
file. Pasting raw, unquoted JSON after the `@` made the shell split it
into multiple arguments, which curl then tried to parse as additional
URLs, hence the "Could not resolve host" messages. Keep the JSON in a
file, or drop the `@` and single-quote the JSON.)

Your next step is to ensure that the field names returned by the
extractor match the field names in your Solr schema. Also, please check
that your schema has all of the required _yz_* fields.
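
For reference, the _yz_* entries in a custom schema should mirror the
ones in the default schema. A sketch of that block from memory — the
exact field list and attributes may differ by Riak version, so copy it
verbatim from the default schema shipped with your install rather than
from here:

```xml
<!-- Required by Yokozuna; _yz_id is the unique key, the rest carry
     Riak metadata used internally by search. -->
<field name="_yz_id"   type="_yz_str" indexed="true" stored="true" multiValued="false" required="true"/>
<field name="_yz_ed"   type="_yz_str" indexed="true" multiValued="false"/>
<field name="_yz_pn"   type="_yz_str" indexed="true" multiValued="false"/>
<field name="_yz_fpn"  type="_yz_str" indexed="true" multiValued="false"/>
<field name="_yz_vtag" type="_yz_str" indexed="true" multiValued="false"/>
<field name="_yz_rk"   type="_yz_str" indexed="true" stored="true" multiValued="false"/>
<field name="_yz_rt"   type="_yz_str" indexed="true" stored="true" multiValued="false"/>
<field name="_yz_rb"   type="_yz_str" indexed="true" stored="true" multiValued="false"/>
<field name="_yz_err"  type="_yz_str" indexed="true" multiValued="false"/>

<uniqueKey>_yz_id</uniqueKey>

<!-- In the <types> section: -->
<fieldType name="_yz_str" class="solr.StrField" sortMissingLast="true"/>
```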
--
Luke Bakken
Engineer
[email protected]


On Fri, Jan 9, 2015 at 9:38 AM, Santi Kumar <[email protected]> wrote:
> Sorry Luke, I pasted the content into a file, re-ran the command, and it
> worked fine. I got the output from the extractor in JSON format with all
> of the non-null fields. Does that mean the custom schema is working?
>
> Thanks for the timely help and for pointing me to the right resources.
>
> Santi
>
> On Fri, Jan 9, 2015 at 10:55 PM, Santi Kumar <[email protected]> wrote:
>>
>> I ran the command as shown below, but I got weird errors. In place of
>> @object.json, I used the actual JSON content. The command was:
>>
>> curl -XPUT http://localhost:8098/search/extract  -H 'Content-Type:
>> application/json' --data-binary
>> @{"id":null,"tenantId":"eb0a1917-9762-3dd3-a48f-a681d3061212","createdById":null,"createdBy":"SYSTEM","created":1420801439932,"lastUpdatedById":null,"lastUpdatedBy":"SYSTEM","lastUpdated":1420801439932,"policyId":"2d29e759-8e30-499a-8ecc-98eb09eeaa9f","searchId":"557134f2-3589-3fdb-927f-6a71e8098eee","name":"Collaborate","description":"Allow
>> editing of file
>> content","isDefault":true,"enable_edit":true,"disable_auth_on_open":true,"enable_copy_paste":"DISALLOW","enable_printing":false,"enable_offline":true,"offline_timeout":null,"time_bomb_duration":null,"enable_save_as":false,"disable_watermark":true,"disable_spotlight":true,"enable_screen_capture":false,"disable_onetime_use":true}
>>
>>
>> The output was:
>>
>> <html><head><title>500 Internal Server Error</title></head><body>
>> <h1>Internal Server Error</h1>The server encountered an error while
>> processing this request:<br><pre>{error,
>>     {error,
>>         {case_clause,<<"tenantId:eb0a1917-9762-3dd3-a48f-a681d3061212">>},
>>         [{mochijson2,tokenize,2,[{file,"src/mochijson2.erl"},{line,529}]},
>>          {mochijson2,decode1,2,[{file,"src/mochijson2.erl"},{line,312}]},
>>          {mochijson2,json_decode,2,[{file,"src/mochijson2.erl"},{line,307}]},
>>          {yz_json_extractor,extract_fields,2,
>>              [{file,"src/yz_json_extractor.erl"},{line,63}]},
>>          {yz_wm_extract,extract,2,[{file,"src/yz_wm_extract.erl"},{line,91}]},
>>          {webmachine_resource,resource_call,3,
>>              [{file,"src/webmachine_resource.erl"},{line,186}]},
>>          {webmachine_resource,do,3,
>>              [{file,"src/webmachine_resource.erl"},{line,142}]},
>>          {webmachine_decision_core,resource_call,1,
>>              [{file,"src/webmachine_decision_core.erl"},{line,48}]}]}}</pre>
>> <P><HR><ADDRESS>mochiweb+webmachine web server</ADDRESS></body></html>
>>
>> curl: (6) Could not resolve host: createdById
>>
>> curl: (6) Could not resolve host: createdBy
>>
>> curl: (3) Port number too large: 1420801439932
>>
>> curl: (6) Could not resolve host: lastUpdatedById
>>
>> curl: (6) Could not resolve host: lastUpdatedBy
>>
>> curl: (3) Port number too large: 1420801439932
>>
>> On Fri, Jan 9, 2015 at 10:29 PM, Luke Bakken <[email protected]> wrote:
>>>
>>> Please run your JSON document through the extractor to ensure that
>>> it's being parsed correctly:
>>>
>>> http://docs.basho.com/riak/latest/dev/advanced/search/#Extractors
>>>
>>> curl -XPUT http://localhost:8098/search/extract \
>>>      -H 'Content-Type: application/json' \
>>>      --data-binary @object.json
>>>
>>> If that works correctly, your schema may not have all of the required
>>> fields:
>>>
>>> http://docs.basho.com/riak/latest/dev/advanced/search-schema/
>>>
>>> http://docs.basho.com/riak/latest/dev/advanced/search-schema/#Custom-Schemas
>>>
>>> --
>>> Luke Bakken
>>> Engineer
>>> [email protected]
>>>
>>>
>>> On Fri, Jan 9, 2015 at 8:55 AM, Santi Kumar <[email protected]> wrote:
>>> > Luke
>>> > It's application/json
>>> >
>>> > Here is the curl command and output dump with content-type in bold
>>
>>
>

_______________________________________________
riak-users mailing list
[email protected]
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
