docs: fix typos (#1470)
Mayureshd-18 authored Nov 1, 2023
1 parent dc3564b commit ce3ec6c
Showing 4 changed files with 6 additions and 6 deletions.
6 changes: 3 additions & 3 deletions .github/ISSUE_TEMPLATE/bug_report.md
@@ -13,11 +13,11 @@ about: Create a report to help us improve
cd neural_prophet
pip install .
```
- * Checked the Answered Questions on the Github Disscussion board: https://github.com/ourownstory/neural_prophet/discussions
+ * Checked the Answered Questions on the Github Discussion board: https://github.com/ourownstory/neural_prophet/discussions
If you have the same question but the Answer does not solve your issue, please continue the conversation there.
* Checked that your issue isn't already filed: https://github.com/ourownstory/neural_prophet/issues
If you have the same issue but there is a twist to your situation, please add an explanation there.
- * Considered whether your bug might actually be solveable by getting a question answered:
+ * Considered whether your bug might actually be solvable by getting a question answered:
* Please [post a package use question](https://github.com/ourownstory/neural_prophet/discussions/categories/q-a-get-help-using-neuralprophet)
* Please [post a forecasting best practice question](https://github.com/ourownstory/neural_prophet/discussions/categories/q-a-forecasting-best-practices)
* Please [post an idea or feedback](https://github.com/ourownstory/neural_prophet/discussions/categories/ideas-feedback)
@@ -48,7 +48,7 @@ Describe what happens, and how often it happens.
If applicable, add screenshots and console printouts to help explain your problem.
- **Environement (please complete the following information):**
+ **Environment (please complete the following information):**
- Python environment [e.g. Python 3.8, in standalone venv with no other packages]
- NeuralProphet version and install method [e.g. 2.7, installed from PYPI with `pip install neuralprophet`]
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/feature_request.md
@@ -7,7 +7,7 @@ about: Suggest an idea for this project
**Prerequisites**

* [ ] Put an X between the brackets on this line if you have done all of the following:
- * Checked the Answered Questions on the Github Disscussion board: https://github.com/ourownstory/neural_prophet/discussions
+ * Checked the Answered Questions on the Github Discussion board: https://github.com/ourownstory/neural_prophet/discussions
If you have the same question but the Answer does not solve your issue, please continue the conversation there.
* Checked that your issue isn't already filed: https://github.com/ourownstory/neural_prophet/issues
If you have the same issue but there is a twist to your situation, please add an explanation there.
2 changes: 1 addition & 1 deletion README.md
@@ -100,7 +100,7 @@ pip install .


### Framework features
- * Multiple time series: Fit a global/glocal model with (partially) shared model parameters
+ * Multiple time series: Fit a global/local model with (partially) shared model parameters
* Uncertainty: Estimate values of specific quantiles - Quantile Regression
* Regularize modeling components
* Plotting of forecast components, model coefficients and more
@@ -21,7 +21,7 @@ NeuralProphet is fit with stochastic gradient descent - more precisely, with an
If the parameter `learning_rate` is not specified, a learning rate range test is conducted to determine the optimal learning rate.
The `epochs`, `loss_func` and `optimizer` are other parameters that directly affect the model training process.
If not defined, `epochs` and `loss_func` are automatically set based on the dataset size. They are set in a manner that controls the total number of training steps to be around 1000 to 4000.
- NeuralProphet offers to set two different values for `optimizer`, namely `AdamW` and `SDG` (stochastic gradient decsent).
+ NeuralProphet offers to set two different values for `optimizer`, namely `AdamW` and `SDG` (stochastic gradient descent).

If it looks like the model is overfitting to the training data (the live loss plot can be useful hereby),
you can reduce `epochs` and `learning_rate`, and potentially increase the `batch_size`.
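
The hunk above describes NeuralProphet's training parameters in prose; below is a minimal sketch of how they might be passed to the model constructor. The parameter names (`learning_rate`, `epochs`, `batch_size`, `loss_func`, `optimizer`) follow the documentation quoted in this diff, but the specific values, the synthetic `df` DataFrame, and the `"SGD"` string are illustrative assumptions, not part of this commit.

```python
import pandas as pd
from neuralprophet import NeuralProphet

# Synthetic example data in the two-column format NeuralProphet expects:
# "ds" for timestamps and "y" for the observed values.
df = pd.DataFrame({
    "ds": pd.date_range("2023-01-01", periods=300, freq="D"),
    "y": [float(i % 7) for i in range(300)],
})

# Setting the training parameters discussed in the hunk above.
# If learning_rate were omitted, a learning rate range test would choose one;
# if epochs or loss_func were omitted, they would be derived from the dataset size.
m = NeuralProphet(
    learning_rate=0.01,   # illustrative value, not a recommendation
    epochs=100,           # reduce if the model overfits the training data
    batch_size=64,        # increase if the model overfits the training data
    loss_func="Huber",    # or e.g. "MSE"
    optimizer="AdamW",    # "SGD" selects stochastic gradient descent
)

metrics = m.fit(df, freq="D")  # returns a DataFrame of training metrics
```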
