Customer cannot submit Spark jobs in InsightEdge 15.0 on specific Kubernetes versions

Description

  • Originally, the customer received a '403 Forbidden' error when submitting a Spark job:

2019-12-04 20:37:58,474 [OkHttp https://kubernetes.default.svc/...] WARN - Exec Failure: HTTP 403, Status: 403 -
java.net.ProtocolException: Expected HTTP 101 response but was '403 Forbidden'
at okhttp3.internal.ws.RealWebSocket.checkResponse(RealWebSocket.java:216)
at okhttp3.internal.ws.RealWebSocket$2.onResponse(RealWebSocket.java:183)
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:141)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
2019-12-04 20:37:58,478 [main] WARN - Kubernetes client has been closed (this is expected if the application is shutting down.)

The customer worked around this by replacing the Kubernetes client jars bundled with Spark. For example (snippet from their Dockerfile):

FROM gigaspaces/insightedge-enterprise:15.0

# Remove the Kubernetes client 4.1.2 jars bundled with Spark
RUN rm /opt/gigaspaces/insightedge/spark/jars/kubernetes-client-4.1.2.jar \
       /opt/gigaspaces/insightedge/spark/jars/kubernetes-model-4.1.2.jar \
       /opt/gigaspaces/insightedge/spark/jars/kubernetes-model-common-4.1.2.jar

# Replace them with the 4.4.2 versions
ADD kubernetes-client-4.4.2.jar /opt/gigaspaces/insightedge/spark/jars/
ADD kubernetes-model-4.4.2.jar /opt/gigaspaces/insightedge/spark/jars/
ADD kubernetes-model-common-4.4.2.jar /opt/gigaspaces/insightedge/spark/jars/
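
After rebuilding the image, it is worth confirming that exactly one version of each kubernetes-client jar remains in the Spark jars directory, since a leftover 4.1.2 jar alongside 4.4.2 would put mixed versions on the classpath. A minimal sketch of such a check (`JarVersionCheck` is a hypothetical helper; the directory path is taken from the Dockerfile above and would be run inside the container):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class JarVersionCheck {
    // Returns the names of all "kubernetes-*.jar" files in the given directory,
    // or an empty list if the directory does not exist or cannot be read.
    static List<String> kubernetesJars(String dir) {
        List<String> found = new ArrayList<>();
        File[] files = new File(dir).listFiles();
        if (files != null) {
            for (File f : files) {
                String name = f.getName();
                if (name.startsWith("kubernetes-") && name.endsWith(".jar")) {
                    found.add(name);
                }
            }
        }
        return found;
    }

    public static void main(String[] args) {
        // Default path matches the Dockerfile snippet above.
        String dir = args.length > 0 ? args[0] : "/opt/gigaspaces/insightedge/spark/jars";
        for (String jar : kubernetesJars(dir)) {
            System.out.println(jar);
        }
    }
}
```

If both 4.1.2 and 4.4.2 jars show up in the listing, the `RUN rm` layer did not remove the old files as intended.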

  • Once this was done, the customer no longer received the '403 Forbidden' error.
    However, they then encountered the following error:

2019-12-05 20:46:16,357 [task-result-getter-0] WARN - Lost task 63.0 in stage 54.0 (TID 13033, 172.30.107.18, executor 72): java.lang.NoClassDefFoundError: org/openspaces/core/space/SpaceConfigurer

at org.insightedge.spark.utils.GridProxyFactory$.org$insightedge$spark$utils$GridProxyFactory$$createSpaceProxy(GridProxyFactory.scala:41)
at org.insightedge.spark.utils.GridProxyFactory$$anonfun$getOrCreateClustered$1.apply(GridProxyFactory.scala:35)
at org.insightedge.spark.utils.GridProxyFactory$$anonfun$getOrCreateClustered$1.apply(GridProxyFactory.scala:35)
at org.insightedge.spark.utils.LocalCache.org$insightedge$spark$utils$LocalCache$$updateIfRequired(LocalCache.scala:43)
at org.insightedge.spark.utils.LocalCache$$anonfun$getOrElseUpdate$1.apply(LocalCache.scala:31)
at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
at scala.collection.AbstractMap.getOrElse(Map.scala:59)
at org.insightedge.spark.utils.LocalCache.getOrElseUpdate(LocalCache.scala:31)
at org.insightedge.spark.utils.GridProxyFactory$.getOrCreateClustered(GridProxyFactory.scala:35)
at org.insightedge.spark.rdd.InsightEdgeRDDFunctions$$anonfun$saveToGrid$1.apply(InsightEdgeRDDFunctions.scala:50)
at org.insightedge.spark.rdd.InsightEdgeRDDFunctions$$anonfun$saveToGrid$1.apply(InsightEdgeRDDFunctions.scala:49)
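
The `NoClassDefFoundError` above suggests that the jar providing `org.openspaces.core.space.SpaceConfigurer` is not visible to the executor JVM. A minimal sketch for confirming this on an executor (`ClasspathCheck` is a hypothetical helper; the class name is taken from the stack trace):

```java
public class ClasspathCheck {
    // Returns where the named class was loaded from, or null if it is not
    // on the classpath at all.
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return src == null ? "(bootstrap classpath)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        String where = locate("org.openspaces.core.space.SpaceConfigurer");
        System.out.println(where == null
            ? "SpaceConfigurer is NOT on the classpath"
            : "SpaceConfigurer loaded from " + where);
    }
}
```

Running this inside the executor container would show whether the failure is a genuinely missing jar or a jar present at a different location than the Spark classpath expects.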

  • Our response was to suggest reverting to the previous Kubernetes version. However, that is not possible on this customer's infrastructure.

  • We are now at an impasse, and the customer would like this resolved as soon as possible.

Workaround

None

Acceptance Test

Verified by regression

Assignee

Alon Shoham

Reporter

Dixson Huie

Labels

None

Priority

Critical

SalesForce Case ID

None

Fix versions

Commitment Version/s

None

Due date

None

Product

None

Edition

Open Source

Platform

All