De Novo

Understanding Skip Gram

08 February 2021, Monday   machine learning   natural language processing  

Skip-gram is one of the most widely used word embedding models to date. It was introduced alongside continuous bag-of-words (CBoW); together the two models were named Word2Vec by Google researchers.
» continue reading
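The core of skip-gram is predicting the surrounding context words from a center word, so each (center, context) pair within a sliding window becomes one training example. A minimal sketch of that pair generation (the function name and window size are illustrative, not from the post):

```python
def skipgram_pairs(tokens, window=2):
    """Generate skip-gram (center, context) training pairs from a token list."""
    pairs = []
    for i, center in enumerate(tokens):
        # every word within `window` positions of the center becomes a context word
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox".split()
print(skipgram_pairs(sentence, window=1))
# → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#    ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```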


Understanding Neural Probabilistic Language Model

04 February 2021, Thursday   machine learning   natural language processing  

The Neural Probabilistic Language Model (NPLM for short; Bengio et al., 2003) was a turning point for word embeddings. Building on the n-gram language model, it showed end-to-end that a neural network trained to predict the next word given an n-gram context can embed lexical context into vectors.
» continue reading
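The idea of predicting the next word from an n-gram context can be sketched as a forward pass in the spirit of the NPLM: embed the previous n−1 words via a shared embedding matrix, concatenate, apply a tanh hidden layer, and softmax over the vocabulary. All dimensions below are illustrative assumptions, and the paper's optional direct connections from input to output are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
V, m, n, h = 10, 4, 3, 8        # vocab size, embedding dim, n-gram order, hidden dim

C = rng.normal(size=(V, m))     # shared embedding matrix: its rows are the word vectors
H = rng.normal(size=(h, (n - 1) * m))
d = np.zeros(h)
U = rng.normal(size=(V, h))
b = np.zeros(V)

def nplm_forward(context_ids):
    """Probability distribution over the next word given n-1 context word ids."""
    x = C[context_ids].reshape(-1)     # concatenate embeddings of the context words
    hidden = np.tanh(d + H @ x)        # tanh hidden layer
    logits = b + U @ hidden            # (direct input-to-output connections omitted)
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()

p = nplm_forward([1, 5])               # n-1 = 2 preceding word ids
print(p.shape)                         # → (10,)
```

Training such a network by maximizing the likelihood of the next word is what makes the rows of `C` useful embeddings.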


Setting up M1 Mac for both TensorFlow and PyTorch

29 January 2021, Friday   machine learning  

Macs with the ARM64-based M1 chip, launched shortly after Apple's initial announcement of its plan to migrate to Apple Silicon, drew a great deal of attention from both consumers and developers. They made headlines especially for their outstanding performance, not just within ARM64 territory but across the entire PC industry.
» continue reading


4.8. Optional stopping theorem

22 January 2021, Friday   probability   Durrett  

In this section, we generalize the bounded version of optional stopping. After that, as an example, we cover a theorem regarding the asymmetric random walk.
» continue reading
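The bounded version being generalized, and the asymmetric random walk application it leads to, can be stated briefly (standard statements, following Durrett, not quoted from the post):

```latex
% Optional stopping, bounded case: if (X_n) is a martingale and T is a
% stopping time with P(T \le k) = 1 for some fixed k, then
E[X_T] = E[X_0].

% Example: asymmetric simple random walk S_n with step distribution
% P(\xi_i = 1) = p \ne 1/2, P(\xi_i = -1) = 1 - p.
% With \varphi(y) = \left(\frac{1-p}{p}\right)^{y}, the process \varphi(S_n)
% is a martingale, and for integers a < x < b, optional stopping gives the
% exit probability starting from x:
P_x(T_b < T_a) = \frac{\varphi(x) - \varphi(a)}{\varphi(b) - \varphi(a)}.
```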


4.5. Square integrable martingales

22 January 2021, Friday   probability   Durrett  

In this section, we look into martingales with a special property: square integrability. Square integrability gives a martingale an upper bound on the expectation of its maximum, which can then be used to determine the convergence of the sequence.
» continue reading
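The upper bound on the maximum referred to above is, in the square integrable case, Doob's $L^2$ maximal inequality (a standard statement, not quoted from the post):

```latex
% Doob's L^2 maximal inequality: for a martingale (M_n) with E[M_n^2] < \infty,
E\left[\max_{0 \le m \le n} M_m^2\right] \le 4\, E[M_n^2].

% Hence, if \sup_n E[M_n^2] < \infty, the maximal function lies in L^2 and
% M_n converges almost surely and in L^2.
```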

