Jupyter as a Service On Lentiq
Data science teams will love the flexibility of Jupyter Notebook offered as a service on Lentiq. They can write code in Python, scale as they go, and share code and notebooks with other team members. This increases collaboration and shortens the time to production. Great results come from great collaboration.
Jupyter as a Service running on Lentiq gives you the freedom to share your projects with other team members and increase collaboration and efficiency.
Working in Jupyter is the ultimate way to train models, but what if you could share your discoveries with the team, gather feedback and new insights, and improve? That's not an unrealistic expectation. On Lentiq it's a service: shareable, portable code and notebooks.
1. Kelly from the marketing data team shares a customer behavior analysis model and its associated data.
2. Jim, looking for something similar for his customer churn prediction use case, reviews the work, runs the experiment, spots an unintended error and flags it to Kelly.
3. Kelly corrects the error, improves the model and republishes it; Jim then extends it for his own use case.
Data experts are part of data teams for a reason: they handle data. We handle the backstage. It's automated, anyway.
Accelerate model training with distributed computing that matches the size of the dataset and the complexity of the task.
Focus your energy on the science side of data rather than on what happens under the hood, with out-of-the-box, automated and integrated infrastructure management.
Just create a Spark application, connect to it from Jupyter, and start crunching data at scale. Scale clusters instantly to match application requirements, and don't worry about infrastructure.
Initialize Spark Context
from pyspark.sql import SparkSession

# Connect to the Spark master of the Spark application created on Lentiq
spark = SparkSession.builder \
    .master("spark://35.228.151.102:7077") \
    .getOrCreate()
For all Spark functions to be available, a Spark session (which wraps the Spark context) has to be initialized in the current notebook.
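Once the session is up, data crunching runs on the cluster rather than in the notebook kernel. Here is a minimal sketch, assuming a CSV of customer events in the project's object store; the path and column names are illustrative placeholders, not Lentiq defaults:

# Hypothetical example: read customer events from shared storage
# and aggregate them in parallel on the Spark cluster.
events = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3a://example-data-lake/customer_events.csv")  # illustrative path
)

daily_counts = (
    events
    .groupBy("event_date", "event_type")  # columns assumed for illustration
    .count()
    .orderBy("event_date")
)

daily_counts.show(10)

The same session can feed Spark MLlib pipelines, so model training scales with the cluster rather than with the notebook's own resources.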
Get instant access to an interactive, curated, pre-configured and extensible data science environment that offers you everything you need.