Re: Need suggestion on how to configure HCP Big Data for Development and Testing
Alright, thanks a lot for the links. :-)

On Sat, Oct 7, 2017 at 2:12 AM, James Sirota wrote:
> As I mentioned in my previous response, https://community.hortonworks.com/topics/Metron.html is where you want to go for help with Hortonworks products.
Re: Need suggestion on how to configure HCP Big Data for Development and Testing
As I mentioned in my previous response, https://community.hortonworks.com/topics/Metron.html is where you want to go for help with Hortonworks products.

---
Thank you,

James Sirota
PPMC - Apache Metron (Incubating)
jsirota AT apache DOT org
Re: Need suggestion on how to configure HCP Big Data for Development and Testing
Hello Ashikin,

HCP is a Hortonworks product, and they have an installation document here:
https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.2.0/bk_installation/content/getting_started.html

The chapter you are looking for is here:
https://docs.hortonworks.com/HDPDocuments/HCP1/HCP-1.3.0/bk_installation/content/installing_configuring_deploying_hdp_cluster.html

The Dell-specific wording (Dell PowerEdge VRTX, M630, and HDD 6006) does not tell me much about the hardware you have. But if I had to guess, I would suggest:

- 1 node for Ambari and the HDFS, HBase, ZooKeeper, Metrics, Zeppelin, Spark, Hive, Tez, and YARN masters.
- 1 node for Metron along with Storm, Kafka, Elasticsearch, and Kibana.
- 2 nodes, plus the Ambari node, for all the slaves and masters that require replication (HDFS DataNodes, ZooKeeper, Kafka brokers, Elasticsearch data nodes).

I have not tried HCP myself, but if I did, I would rather post all my HCP-related questions to the Hortonworks forums (https://community.hortonworks.com/index.html; they have really good support there) than to the Apache Metron devs, as the two are separate (Hortonworks uses Apache Metron, open-source software, in their HDP framework).

- Dima
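For concreteness, the layout above can be sketched as a simple service-to-host map and sanity-checked before you commit to it in Ambari. The host names (node1..node4) and service labels here are hypothetical placeholders, not Ambari's exact component names; adjust both to your environment:

```python
# Sketch of the 4-node layout suggested above (hypothetical names).
# node1: Ambari plus the non-replicated masters; node2: Metron stack;
# node1 + node3 + node4: the roles that need replication.
layout = {
    "node1": ["ambari-server", "namenode", "hbase-master", "zookeeper",
              "ambari-metrics", "zeppelin", "spark-master", "hive-server",
              "tez", "yarn-resourcemanager", "datanode", "kafka-broker",
              "elasticsearch"],
    "node2": ["metron", "storm-nimbus", "kafka-broker", "elasticsearch",
              "kibana"],
    "node3": ["datanode", "zookeeper", "kafka-broker", "elasticsearch"],
    "node4": ["datanode", "zookeeper", "kafka-broker", "elasticsearch"],
}

def hosts_running(service):
    """Return the hosts on which a given service role is placed."""
    return sorted(host for host, roles in layout.items() if service in roles)

# ZooKeeper needs an odd-sized quorum; 3 hosts tolerate one failure.
assert len(hosts_running("zookeeper")) == 3
# HDFS default replication factor is 3, so place 3 DataNodes.
assert len(hosts_running("datanode")) == 3
```

A quick check like this catches the common mistake of ending up with a two-member ZooKeeper ensemble or fewer DataNodes than the HDFS replication factor.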
Need suggestion on how to configure HCP Big Data for Development and Testing
Hi, can anyone help me choose an appropriate deployment for the Hortonworks Cybersecurity Package in this environment? We have a Dell PowerEdge VRTX with 4 nodes (M630 x 4) and HDD 6006 x 25 (shared storage).

How should we manage all these resources to properly configure HCP?

Thanks in advance.