forked from RyanZotti/Self-Driving-Car
Commit 6e7575f (parent: 31e3ec2), authored and committed by Ryan Zotti on Oct 23, 2016.
Showing 1 changed file with 24 additions and 3 deletions.
```diff
@@ -6,8 +6,7 @@ Run thes
 # Log in as Hadoop user and make ec2-user directories
 # If you don't do this step your Spark code will immediately fail with permission issues
-sudo su
-su hadoop
+sudo su hadoop
 hadoop fs -mkdir -p /user/ec2-user
 hadoop fs -chown ec2-user /user/ec2-user
 
```
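A quick way to sanity-check this step, sketched under the same EMR layout as above (the expected owner and group columns are an assumption about the default EMR setup):

```sh
# Confirm ec2-user now owns its HDFS home directory
sudo su hadoop -c "hadoop fs -ls /user"
# Expect the owner column for /user/ec2-user to read "ec2-user", e.g.:
# drwxr-xr-x   - ec2-user hadoop          0 2016-10-23 10:00 /user/ec2-user
```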
```diff
@@ -29,4 +28,26 @@ Run thes
 export MASTER="yarn-client"
 
 # Start up pysparkling
-/home/ec2-user/sparkling-water-1.6.8/bin/pysparkling
+/home/ec2-user/sparkling-water-1.6.8/bin/pysparkling --deploy-mode client
```
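Why `--deploy-mode client`: an interactive REPL needs the Spark driver on the node you are typing into; in cluster mode YARN would place the driver in a remote container and the shell could not attach to it. A hedged sketch (the 1.6.8 path comes from the steps above; adjust if your install differs):

```sh
# Locate the launcher if your sparkling-water version is not 1.6.8
ls /home/ec2-user/sparkling-water-*/bin/pysparkling

# MASTER can also be set inline for a single launch instead of exporting it
MASTER="yarn-client" /home/ec2-user/sparkling-water-1.6.8/bin/pysparkling --deploy-mode client
```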
```diff
+
+# Note the shell's tracking URL, which will look something like this:
+http://ip-10-0-0-123.ec2.internal:20888/proxy/application_1477154041215_0004/
+
+# Open up a new Terminal tab. We're going to do port forwarding / ssh tunneling to view the Spark UI
+ssh -i /Users/ryanzotti/Documents/private_keys/ML.pem -L 20888:localhost:20888 ec2-user@<master-public-dns>
+
+# Open up your web browser to the tracking URL. Replace the IP with localhost
+http://localhost:20888/proxy/application_1477154041215_0004/
```
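If the tracking URL scrolls out of the shell's output, YARN can reprint it. A sketch assuming it is run on the cluster's master node with the stock YARN CLI on the PATH:

```sh
# List running YARN applications; the final column is each app's tracking URL
yarn application -list -appStates RUNNING
# Look for a line ending in something like:
# http://ip-10-0-0-123.ec2.internal:20888/proxy/application_1477154041215_0004/
```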
```diff
+
+
+## FAQ
+
+**Question:** You get a never-ending stream of:
+
+    Client: Application report for application_1477154041215_0013 (state: ACCEPTED)
+
+**Answer:** You probably have too many simultaneous (potentially abandoned) Spark shells running.
+
+ps -ef | grep -i spark
+kill -9 <spark process id>
```
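For bulk cleanup of abandoned shells, a hedged alternative to killing PIDs one at a time (`pkill -f` matches the full command line, so review the matches before sending -9; the application id below is the one from the question):

```sh
# Review what would be killed first; the "[s]" keeps grep from matching itself
ps -ef | grep -i "[s]park"
# Kill every process whose command line mentions spark (aggressive; use with care)
pkill -9 -f spark
# Or release the queued YARN application directly if you know its id
yarn application -kill application_1477154041215_0013
```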