Thanks. I don't have Hive installed on my HDP 2.6 cluster, which runs Spark
2.2.0. I didn't know that Spark 2.3.1 introduces a hard dependency on Hive,
and I did not find that information in the Spark 2.3.1 documentation.

I will try installing Hive on my HDP 3.0 cluster.
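
As a quick sanity check before and after that, here is a minimal sketch (the
Ambari host, cluster name, and admin credentials below are placeholders;
adjust for your setup) that asks the Ambari REST API whether HIVE is
installed:

import requests

AMBARI = "http://ambari-host:8080"   # placeholder host
AUTH = ("admin", "admin")            # placeholder credentials
CLUSTER = "mycluster"                # placeholder cluster name

# GET /api/v1/clusters/<name>/services lists the services installed on
# the cluster; HIVE should appear here once it is installed.
resp = requests.get(
    AMBARI + "/api/v1/clusters/" + CLUSTER + "/services",
    auth=AUTH,
    headers={"X-Requested-By": "ambari"},
)
resp.raise_for_status()
names = [item["ServiceInfo"]["service_name"] for item in resp.json()["items"]]
print("HIVE installed:", "HIVE" in names)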

On Thu, Jul 26, 2018 at 12:21 PM, Vitaly Brodetskyi <
vbrodets...@hortonworks.com> wrote:

> Hi Lian Jiang
>
>
>     According to the stack trace from the SPARK2 service, it looks like you
> don't have the HIVE service installed on your cluster. If the HIVE service
> were installed, the "hive-env" config would definitely be available. As far
> as I know, HIVE is a required service for SPARK2.
>
>
> Regards
>
> Vitalyi
> ------------------------------
> *From:* Lian Jiang <jiangok2...@gmail.com>
> *Sent:* 26 July 2018, 22:08
> *To:* user@ambari.apache.org
> *Subject:* Re: install HDP3.0 using ambari
>
> During migration, spark2 and livy2-server fail to start due to:
>
> 2018-07-26 18:18:09,024 - The 'livy2-server' component did not advertise a
> version. This may indicate a problem with the component packaging. However,
> the stack-select tool was able to report a single version installed
> (3.0.0.0-1634). This is the version that will be reported.
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/livy2_server.py", line 148, in <module>
>     LivyServer().execute()
>   File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 353, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/livy2_server.py", line 43, in install
>     import params
>   File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/params.py", line 220, in <module>
>     if hive_metastore_db_type == "mssql":
>   File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
>     raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
> resource_management.core.exceptions.Fail: Configuration parameter 'hive-env' was not found in configurations dictionary!
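>
> For context on the failure mechanism: params.py reads Hive-related settings
> unconditionally, and when a config type was never delivered to the agent,
> Ambari substitutes a placeholder whose attribute access fails loudly. A
> minimal Python sketch of that behavior (simplified for illustration, not
> the actual Ambari source):
>
> class Fail(Exception):
>     pass
>
> class UnknownConfiguration(object):
>     """Placeholder for a config type that was never delivered to the
>     agent; any attribute access on it raises Fail."""
>     def __init__(self, name):
>         self.name = name
>     def __getattr__(self, attr):
>         raise Fail("Configuration parameter '" + self.name +
>                    "' was not found in configurations dictionary!")
>
> # With no HIVE service installed, 'hive-env' resolves to the placeholder,
> # so the first read of any of its keys aborts the livy2-server install.
> hive_env = UnknownConfiguration("hive-env")
> hive_user = hive_env.hive_user  # raises Fail, as in the traceback above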
>
> I observed that
> /var/lib/ambari-agent/cache/stacks/HDP/2.6/services/YARN/configuration/yarn-site.xml
> does have:
>
> <property>
>   <name>yarn.nodemanager.kill-escape.user</name>
>   <value>hive</value>
>   <depends-on>
>     <property>
>       <type>hive-env</type>
>       <name>hive_user</name>
>     </property>
>   </depends-on>
>   <on-ambari-upgrade add="false"/>
> </property>
>
> /var/lib/ambari-agent/cache/stacks/HDP/3.0/services/YARN/configuration/yarn-site.xml
>  doesn't.
>
> The files below are identical:
>
> /var/lib/ambari-agent/cache/stacks/HDP/2.6/services/SPARK2/configuration/livy2-env.xml
>
> vs
> /var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/configuration/livy2-env.xml
>
>
>
> /var/lib/ambari-agent/cache/stacks/HDP/2.6/services/SPARK2/configuration/livy2-conf.xml
>
> vs
> /var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/configuration/livy2-conf.xml
>
>
> I don't see anything wrong in
> /var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/configuration/spark2-defaults.xml
> /var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/configuration/spark2-env.xml
>
> either.
>
>
> Any ideas would be highly appreciated! Thanks.
>
>
> On Tue, Jul 24, 2018 at 3:56 PM, Lian Jiang <jiangok2...@gmail.com> wrote:
>
>> Thanks. I will try option 1, given that I cannot find enough documentation
>> or examples online for the blueprint schema changes.
>>
>> On Tue, Jul 24, 2018 at 3:49 PM, Benoit Perroud <ben...@noisette.ch>
>> wrote:
>>
>>> HDP 3 no longer ships Spark (1.x), only Spark2.
>>>
>>> In general, old blueprints are not fully compatible and have to be
>>> tweaked a bit.
>>>
>>> I see two options from where you are:
>>>
>>> 1) Upgrade your current blueprint, i.e. use it with HDP 2.6+, run the
>>> upgrade wizard from Ambari 2.7 to HDP 3, and export a new version of the
>>> blueprint.
>>> 2) Manually update the blueprint and remove the spark-defaults section
>>> it has (see the sketch below). This still does not guarantee the blueprint
>>> will work; you might need to do more customisation.
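>>>
>>> As an illustration of option 2, a minimal sketch (assuming the usual
>>> blueprint JSON layout, with "configurations" arrays at the top level and
>>> inside each host group; the file names here are hypothetical):
>>>
>>> import json
>>>
>>> def drop_spark_defaults(cfgs):
>>>     # Each entry in a "configurations" array is a one-key dict keyed
>>>     # by config type, e.g. {"spark-defaults": {...}}.
>>>     return [c for c in cfgs if "spark-defaults" not in c]
>>>
>>> with open("blueprint.json") as f:
>>>     bp = json.load(f)
>>>
>>> # Strip spark-defaults at the top level and inside every host group.
>>> bp["configurations"] = drop_spark_defaults(bp.get("configurations", []))
>>> for group in bp.get("host_groups", []):
>>>     group["configurations"] = drop_spark_defaults(group.get("configurations", []))
>>>
>>> with open("blueprint-hdp3.json", "w") as f:
>>>     json.dump(bp, f, indent=2)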
>>>
>>> Benoit
>>>
>>>
>>>
>>>
>>> On 25 Jul 2018, at 00:05, Lian Jiang <jiangok2...@gmail.com> wrote:
>>>
>>> Thanks Benoit for the advice.
>>>
>>> I switched to Ambari 2.7. However, when I created the cluster, it failed
>>> due to "config types are not defined in the stack: [spark-defaults]".
>>>
>>> The links below point to a blueprint spec older than Ambari 2.7:
>>> https://cwiki.apache.org/confluence/display/AMBARI/Blueprints#Blueprints-BlueprintStructure
>>> https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.0.0/administering-ambari/content/amb_using_ambari_blueprints.html
>>>
>>> https://github.com/apache/ambari/tree/release-2.7.0/ambari-server/src/main/resources/stacks/HDP
>>> does not have HDP 3.0, which makes it hard to troubleshoot.
>>>
>>> Do you know where I can find the source code of the HDP 3.0 Ambari stack
>>> so that I can check which configs are supported in the new Ambari?
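>>>
>>> In the meantime, here is a minimal sketch for listing which config types
>>> the installed stack actually defines, assuming the stack definitions live
>>> under /var/lib/ambari-server/resources/stacks on the Ambari server host
>>> (that path is an assumption; adjust for your installation):
>>>
>>> import glob
>>> import os
>>>
>>> # Assumed on-disk location of the SPARK2 stack configuration files.
>>> base = ("/var/lib/ambari-server/resources/stacks/"
>>>         "HDP/3.0/services/SPARK2/configuration")
>>>
>>> # Each XML file corresponds to one config type the stack accepts,
>>> # e.g. spark2-defaults.xml -> config type "spark2-defaults".
>>> for path in sorted(glob.glob(os.path.join(base, "*.xml"))):
>>>     print(os.path.splitext(os.path.basename(path))[0])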
>>>
>>> Thanks.
>>>
>>>
>>>
>>> On Mon, Jul 23, 2018 at 2:35 PM, Benoit Perroud <ben...@noisette.ch>
>>> wrote:
>>>
>>>> Are you using Ambari 2.7?
>>>>
>>>> Make sure you upgrade Ambari to 2.7 first, since this version is
>>>> required for HDP 3.
>>>>
>>>> Benoit
>>>>
>>>>
>>>> On 23 Jul 2018, at 23:32, Lian Jiang <jiangok2...@gmail.com> wrote:
>>>>
>>>> Hi,
>>>>
>>>> I am using an Ambari blueprint to install HDP 3.0 and cannot register
>>>> the VDF file.
>>>>
>>>> The VDF file is (the URL works):
>>>>
>>>> {
>>>>   "VersionDefinition": {
>>>>      "version_url": "http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0/HDP-3.0.0.0-1634.xml"
>>>>   }
>>>> }
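>>>>
>>>> For reference, this is how I am registering it, as a minimal sketch
>>>> (the Ambari host and admin credentials below are placeholders; adjust
>>>> for your setup):
>>>>
>>>> import json
>>>> import requests
>>>>
>>>> AMBARI = "http://ambari-host:8080"   # placeholder host
>>>> AUTH = ("admin", "admin")            # placeholder credentials
>>>>
>>>> vdf = {"VersionDefinition": {"version_url":
>>>>     "http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.0.0.0/HDP-3.0.0.0-1634.xml"}}
>>>>
>>>> # POST the definition to the version_definitions endpoint; Ambari
>>>> # fetches and validates the XML at version_url during registration.
>>>> resp = requests.post(
>>>>     AMBARI + "/api/v1/version_definitions",
>>>>     auth=AUTH,
>>>>     headers={"X-Requested-By": "ambari"},
>>>>     data=json.dumps(vdf),
>>>> )
>>>> print(resp.status_code, resp.text)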
>>>>
>>>> The error is "An internal system exception occurred: Stack data, Stack
>>>> HDP 3.0 is not found in Ambari metainfo"
>>>>
>>>> Any idea? Thanks.
>>>>
>>>>
>>>>
>>>
>>>
>>
>
