I've got a Jupyter Notebook set up to do a series of ETL and writeback tasks that I want to log.

The volume will be relatively light, hundreds to maybe thousands of text rows per run at a debug level like this. Later on I'll likely want to reduce the flow, so trimming the output over time would be useful. The job might run anywhere from 1 to say 144 times a day, indefinitely, so a fixed schema for each data element may be hard, since the results from each step of the process have a different shape. I currently have the results showing up on screen through a number of print statements in a cell.

Wondering what approach folks would suggest for managing the output from a Jupyter Notebook in the DSS environment.

If these are indeed logs (meaning output that is useful for understanding what happened, but not required as long as everything is working OK), a dataset may be a slightly atypical choice.
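One common alternative to both print statements and a dataset is the standard-library `logging` module with a size-capped rotating file. A minimal sketch follows; it is not DSS-specific, and the logger name, file path, and size limits here are illustrative assumptions, not anything from the original post. It addresses the points raised above: the level can be dropped from DEBUG to INFO later to reduce the flow, free-text messages avoid the fixed-schema problem, and rotation trims old output automatically.

```python
import logging
from logging.handlers import RotatingFileHandler

# Hypothetical logger name and file path -- adjust for your project.
logger = logging.getLogger("etl_writeback")
logger.setLevel(logging.DEBUG)  # drop to logging.INFO later to reduce volume

# RotatingFileHandler caps total disk use, so old runs are trimmed
# automatically: keep up to 5 files of ~1 MB each.
handler = RotatingFileHandler("etl_writeback.log",
                              maxBytes=1_000_000, backupCount=5)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

# In a notebook cell, instead of print():
logger.debug("step 1: extracted %d rows", 1234)
logger.info("writeback complete")
```

Each run then appends timestamped, level-tagged lines to the same file, and switching verbosity later is a one-line change rather than deleting print statements cell by cell.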