[ https://issues.apache.org/jira/browse/SPARK-13232?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin resolved SPARK-13232.
------------------------------------
    Resolution: Won't Fix

It seems the conclusion in the bug is that this is a YARN bug (possibly YARN-4925) and that people who run into it should upgrade, or change their Spark app config. If YARN is still broken, then another YARN bug should be filed.

> YARN executor node label expressions
> ------------------------------------
>
>                 Key: SPARK-13232
>                 URL: https://issues.apache.org/jira/browse/SPARK-13232
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>         Environment: Scala 2.11.7, Hadoop 2.7.2, Spark 1.6.0
>            Reporter: Atkins
>            Priority: Minor
>
> Using a node label expression for executors fails to build the container request and throws *InvalidContainerRequestException*.
> The validation code in YARN
> {code:title=AMRMClientImpl.java}
> /**
>  * Validate if a node label expression specified on a container request is
>  * valid or not
>  *
>  * @param containerRequest
>  */
> private void checkNodeLabelExpression(T containerRequest) {
>   String exp = containerRequest.getNodeLabelExpression();
>
>   if (null == exp || exp.isEmpty()) {
>     return;
>   }
>
>   // Don't support specifying >= 2 node labels in a node label expression now
>   if (exp.contains("&&") || exp.contains("||")) {
>     throw new InvalidContainerRequestException(
>         "Cannot specify more than two node labels"
>             + " in a single node label expression");
>   }
>
>   // Don't allow specifying a node label against an ANY request
>   if ((containerRequest.getRacks() != null &&
>       (!containerRequest.getRacks().isEmpty()))
>       ||
>       (containerRequest.getNodes() != null &&
>       (!containerRequest.getNodes().isEmpty()))) {
>     throw new InvalidContainerRequestException(
>         "Cannot specify node label with rack and node");
>   }
> }
> {code}
> does not allow combining a node label with rack or node constraints.
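
To make the failure mode concrete, below is a minimal sketch (not part of the original report) of the request pattern the check above rejects. The label name "gpu", the host name, and the resource sizes are hypothetical; the six-argument ContainerRequest constructor (capability, nodes, racks, priority, relaxLocality, nodeLabelsExpression) is available in Hadoop 2.6+.

{code:java}
import org.apache.hadoop.yarn.api.records.Priority;
import org.apache.hadoop.yarn.api.records.Resource;
import org.apache.hadoop.yarn.client.api.AMRMClient;
import org.apache.hadoop.yarn.client.api.AMRMClient.ContainerRequest;

public class NodeLabelRequestSketch {

  // Assumes an AMRMClient that was created, initialized, and started
  // inside an application master already registered with the RM.
  static void requestContainers(AMRMClient<ContainerRequest> client) {
    Resource capability = Resource.newInstance(1024, 1); // hypothetical sizing
    Priority priority = Priority.newInstance(1);

    // Label-only request: passes checkNodeLabelExpression.
    client.addContainerRequest(new ContainerRequest(
        capability, null, null, priority, true, "gpu"));

    // Label plus an explicit node list: checkNodeLabelExpression throws
    // InvalidContainerRequestException ("Cannot specify node label with
    // rack and node") from inside addContainerRequest.
    client.addContainerRequest(new ContainerRequest(
        capability, new String[] {"host1.example.com"}, null, priority,
        true, "gpu"));
  }
}
{code}

The exception is raised from addContainerRequest, which is why a Spark application master only hits it once it tries to allocate executors.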
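
The "change their Spark app config" part of the resolution refers to the node label settings Spark has exposed since 1.6: spark.yarn.am.nodeLabelExpression and spark.yarn.executor.nodeLabelExpression. A minimal sketch of setting them, assuming a node label named "gpu" has been defined in YARN and enabled for the target queue:

{code:java}
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class NodeLabelConfigSketch {
  public static void main(String[] args) {
    // "gpu" is a hypothetical node label; it must exist in YARN and be
    // usable by the queue the application is submitted to.
    SparkConf conf = new SparkConf()
        .setAppName("node-label-example")
        .set("spark.yarn.am.nodeLabelExpression", "gpu")
        .set("spark.yarn.executor.nodeLabelExpression", "gpu");

    JavaSparkContext sc = new JavaSparkContext(conf);
    // ... job code ...
    sc.stop();
  }
}
{code}

The same properties can be passed with --conf on spark-submit. Given the YARN restriction quoted above, a labelled executor request cannot also carry node or rack locality preferences; clusters that need both require the fix tracked in YARN-4925.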