Install the Crestron CH5 CLI Tools

npm install -g @crestron/ch5-utilities-cli
npm install -g @crestron/ch5-shell-cli

Create a Project using the CH5 Shell CLI

ch5-shell-cli create:project

Create a Git Ignore file for the Node Modules
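A minimal .gitignore that keeps the installed dependencies out of version control (a sketch; add build-output folders such as dist as your project requires):

```
node_modules/
```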




npm run start

Open a browser to the local development URL shown in the terminal output of npm run start.


To change from the light theme to the dark theme, edit the shell template's project configuration JSON and set "selectedTheme" to "dark-theme":

"projectName": "shell-template",

"version": "0.0.1",

"faviconPath": "favicon.ico",

"menuOrientation": "horizontal",

"selectedTheme": "dark-theme",

"useWebXPanel": true,



Bidirectional Encoder Representations from Transformers (BERT) is a Transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google.

BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google.

As of 2019, Google has been leveraging BERT to better understand user searches.

The original English-language BERT has two model sizes: (1) BERT-Base: 12 encoders with 12 bidirectional self-attention heads, and (2) BERT-Large: 24 encoders with 16 bidirectional self-attention heads. Both models are pre-trained on unlabeled data extracted from the BooksCorpus (800M words) and English Wikipedia (2,500M words).
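BERT's encoder layers are built around bidirectional self-attention: every token attends to every other token in the sequence, with no causal mask. The following is a minimal single-head sketch in plain Python (toy dimensions and identity projection matrices chosen for illustration — not Google's implementation):

```python
import math

def softmax(row):
    # Numerically stable softmax over one row of scores.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    total = sum(exps)
    return [e / total for e in exps]

def matmul(A, B):
    # Plain list-of-lists matrix multiply.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def self_attention(X, Wq, Wk, Wv):
    # Bidirectional scaled dot-product self-attention for a single head:
    # each token's query is scored against the keys of ALL tokens.
    Q, K, V = matmul(X, Wq), matmul(X, Wk), matmul(X, Wv)
    d = len(Q[0])
    scores = [[sum(q * k for q, k in zip(q_row, k_row)) / math.sqrt(d)
               for k_row in K] for q_row in Q]
    weights = [softmax(row) for row in scores]   # one distribution per token
    return matmul(weights, V), weights

# Toy example: 3 tokens, model dimension 2, identity projections.
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
I = [[1.0, 0.0], [0.0, 1.0]]
out, weights = self_attention(X, I, I, I)
```

Each row of `weights` is a probability distribution over all three tokens, which is what "bidirectional" means here: attention runs over the full sequence in both directions rather than only over preceding tokens.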