datadrenaline
Comments
@ojitha · 1 year ago
spark-master:7077 is not working, and localhost is not working either. Any solutions?
@senthilkumarashokkumar8153 · 1 year ago
Check that your docker-compose file forwards port 7077 so the master can be reached from outside Docker.
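A quick way to verify from the host that the port is actually reachable (a minimal sketch; it assumes the master's 7077 is published as localhost:7077, which may not match your compose file):

```python
import socket

# Minimal connectivity check from the host machine. Assumes docker-compose
# publishes the Spark master's RPC port as localhost:7077.
def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(port_open("localhost", 7077))  # False usually means the port mapping is missing
```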
@bensberg3968 · 1 year ago
WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master spark-master:7077
@sparkstudent · 1 year ago
I'm getting an error when running the code from an IP other than the Spark server or localhost; could that be the issue? I've published the port on my server, but I still get "An existing connection was forcibly closed by the remote host". Any help?
@VijayKumar-bt1bb · 1 year ago
Could you please make a video on running PySpark jobs on Airflow 2.4.0 with Bitnami Spark 3.3.1? I have been trying to run the PySpark jobs but keep getting the "Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources" error, even though I can see that the worker node has enough memory and is active in the Spark master UI.
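Roughly what I'm submitting, for context (the app name, master URL, and resource numbers below are placeholders for my actual values):

```python
from pyspark.sql import SparkSession

# Rough sketch of the job submitted from the Airflow task. The app name,
# master URL and resource numbers are placeholders; the intent is to
# request no more cores/memory than one worker actually advertises in the
# master UI.
spark = (
    SparkSession.builder
    .appName("airflow-pyspark-test")
    .master("spark://spark-master:7077")
    .config("spark.cores.max", "2")          # stay within the worker's cores
    .config("spark.executor.memory", "1g")   # stay within the worker's memory
    .getOrCreate()
)

print(spark.range(10).count())
spark.stop()
```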
@vinayakkgarg2263 · 2 years ago
How can we use Docker to set up Spark workers on a different node (server)?
@tieduprightnowprcls · 1 year ago
Try exposing port 7077 on the spark-master container so that it can be reached publicly or privately. Then have the workers on the other server set SPARK_MASTER_URL to the Spark master's IP:7077.
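On the driver side you can follow the same convention and read the address from an environment variable, for example (using SPARK_MASTER_URL as the variable name, and a localhost fallback, is just my convention here, not something Spark requires):

```python
import os
from pyspark.sql import SparkSession

# Same idea on the driver: take the master address from the environment so
# the code works both inside Docker (service name) and from another host
# (master IP). The variable name and the fallback are assumptions.
master_url = os.environ.get("SPARK_MASTER_URL", "spark://localhost:7077")

spark = (
    SparkSession.builder
    .appName("remote-worker-check")
    .master(master_url)
    .getOrCreate()
)
```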
@doniuppa · 2 years ago
Won't start; I'm using Windows.
spark-livy_1 | Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: SPARK_HOME path does not exist
spark-livy_1 |     at scala.Predef$.require(Predef.scala:224)
spark-livy_1 |     at org.apache.livy.utils.LivySparkUtils$.testSparkHome(LivySparkUtils.scala:56)
spark-livy_1 |     at org.apache.livy.server.LivyServer.start(LivyServer.scala:77)
spark-livy_1 |     at org.apache.livy.server.LivyServer$.main(LivyServer.scala:423)
spark-livy_1 |     at org.apache.livy.server.LivyServer.main(LivyServer.scala)
spark_docker_spark-livy_1 exited with code 1
Stopping spark_docker_spark-worker_1 ...
Stopping spark_docker_spark-master_1 ...
@ramsescoraspe · 2 years ago
Why is the app in the Spark UI still shown as running?
@gtosXD · 2 years ago
Waiting for the next one! Amazing content
@piyushjain3763 · 2 years ago
Wonderful video... I was getting an error in my code for a similar thing and resolved it with this video. Waiting for the next video...
@mateusleao7093 · 2 years ago
It would be good if we got an answer; I had the same problem as the other guys. When pointing at the image tag it can't find it, and when pointing at localhost:7077 it says the worker has no resources.
@jaykiran15 · 2 years ago
Did you get this sorted? I have the same issue.
@HariharApamarjane-yi1qo · 7 months ago
I am also facing the same issue. Please let us know if you folks find any solution. @jaykiran15 @mateusleao7093
@thejam3184 · 3 years ago
When are you guys releasing the next part of the video, in which we execute Spark jobs using the Livy REST API?
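In the meantime, here is a rough sketch of submitting a batch job through Livy's REST API; the Livy address and the script path inside the container are assumptions for this example:

```python
import requests

# Rough sketch of submitting a PySpark script as a Livy batch job.
# The Livy address and the script path are assumptions; adjust them to
# your docker-compose setup.
LIVY_URL = "http://localhost:8998"

resp = requests.post(
    f"{LIVY_URL}/batches",
    json={"file": "/opt/workspace/my_job.py"},
)
batch = resp.json()
print(batch["id"], batch["state"])

# Poll the batch until Livy reports a terminal state.
status = requests.get(f"{LIVY_URL}/batches/{batch['id']}").json()
print(status["state"])
```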
@gtosXD · 3 years ago
Great!!! Thanks for sharing this!
@amanjain9352 · 3 years ago
Great video. I am facing issues while running the code; the error says it can't connect to the master. Can you help?
@makaranddeshpande2621 · 2 years ago
I am getting the same error.
@TheProKiller397 · 2 years ago
Changing the master to "spark://localhost:7077" should fix the error :D
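For example (a minimal sketch; it assumes the compose file publishes the master's 7077 on localhost):

```python
from pyspark.sql import SparkSession

# Minimal sketch: connect to the standalone master through the published
# port on the host. Assumes docker-compose maps the master's 7077 to
# localhost:7077.
spark = (
    SparkSession.builder
    .appName("local-client-test")
    .master("spark://localhost:7077")
    .getOrCreate()
)

spark.range(5).show()
spark.stop()
```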
@ojitha · 1 year ago
@TheProKiller397 localhost is not working. Did you find any solutions?
@dodat12 · 3 months ago
@TheProKiller397 Tysm <3333
@gurumoorthy810 · 3 years ago
Unable to access the Spark master using spark://spark-master:7077 locally; I'm getting a DNS resolution error. Any solutions?
@priyankabaswa378 · 3 years ago
Try localhost:8080.
@Ironfeet996 · 3 years ago
@priyankabaswa378 I'm facing a similar issue. I changed spark.master to os.environ.get("SPARK_MASTER_URL", "spark://localhost:7077"), and now I'm getting a different error: "Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources". Even after I removed the memory and core settings on the worker, the error persists.
@JavierHernandez-iy1dp · 3 years ago
Hey, thanks! Just one question if I may: I started the containers from raw.githubusercontent.com/bitnami/bitnami-docker-spark/master/docker-compose.yml using docker-compose up. I have checked that the Scala and Spark versions are correct, and I also copied your SparkSession configuration and set my master URL. When I try to run the program I get: WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master f85ba43781d0:7077. Many thanks!
@ojitha · 1 year ago
That URL cannot be found.
@zinukchakma5442 · 3 years ago
Nice 👍
@ojitha · 1 year ago
This is not working.