Understanding Skip Gram
Skip-gram is one of the most widely used word embedding models to date. It was introduced alongside continuous bag-of-words (CBoW) by Google researchers, the two models jointly named Word2Vec.
» continue reading
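As a rough illustration of the idea behind the model (a hypothetical toy corpus and window size, not code from the post): skip-gram turns text into (center, context) training pairs, where each word is trained to predict the words appearing within a window around it.

```python
# Toy corpus and context window, both hypothetical.
corpus = ["the", "quick", "brown", "fox", "jumps"]
window = 2

# Skip-gram extracts (center, context) pairs: each word predicts
# every word within `window` positions of it; a full model then
# learns embeddings that make these predictions likely.
pairs = [
    (center, corpus[j])
    for i, center in enumerate(corpus)
    for j in range(max(0, i - window), min(len(corpus), i + window + 1))
    if j != i
]

print(pairs[:4])  # [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown')]
```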
Understanding Neural Probabilistic Language Model
Neural Probabilistic Language Model (NPLM for short; Bengio et al., 2003) was a turning point for word embedding. Built on the n-gram language model as an end-to-end model, it demonstrated that a neural network trained to predict the following word given an n-gram can embed lexical context into vectors.
» continue reading
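For a feel of the architecture, here is a minimal PyTorch skeleton (all dimensions assumed for illustration; it also omits the direct embedding-to-output connections of the original paper): the previous words are embedded, the embeddings concatenated, and a hidden layer scores the next word over the vocabulary.

```python
import torch
import torch.nn as nn

class NPLM(nn.Module):
    """Skeleton of Bengio et al. (2003); all sizes are assumed."""
    def __init__(self, vocab_size=10000, context=3, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)          # shared word features
        self.hidden = nn.Linear(context * embed_dim, hidden_dim)  # concatenated context
        self.out = nn.Linear(hidden_dim, vocab_size)              # next-word scores

    def forward(self, context_ids):             # (batch, context) word indices
        e = self.embed(context_ids).flatten(1)  # (batch, context * embed_dim)
        h = torch.tanh(self.hidden(e))          # (batch, hidden_dim)
        return self.out(h)                      # (batch, vocab_size) logits

logits = NPLM()(torch.randint(0, 10000, (4, 3)))  # dummy batch of 3-word contexts
print(logits.shape)                               # torch.Size([4, 10000])
```

After training with cross-entropy on next-word prediction, the rows of `embed.weight` serve as the word vectors.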
Setting up M1 Mac for both TensorFlow and PyTorch
Macs with the ARM64-based M1 chip, launched shortly after Apple's initial announcement of its plan to migrate to Apple Silicon, drew considerable attention from both consumers and developers. They made headlines especially for their outstanding performance, not merely within ARM64 territory but across the entire PC industry.
» continue reading
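The actual setup steps are in the full post; as a quick sanity check once both frameworks are installed (assuming TensorFlow came from the tensorflow-macos/tensorflow-metal packages and PyTorch is a build recent enough, 1.12+, to ship the MPS backend — neither is confirmed by this excerpt):

```python
# Verify both frameworks can see Apple-silicon acceleration.
import tensorflow as tf
import torch

print("TensorFlow GPU devices:", tf.config.list_physical_devices("GPU"))
print("PyTorch MPS available:", torch.backends.mps.is_available())
```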
4.8. Optional stopping theorem
In this section, we generalize the bounded version of optional stopping. After that, as an example, we will cover a theorem regarding the asymmetric random walk.
» continue reading
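For orientation, one standard statement of the theorem (the section's exact formulation may differ): if $(X_n)_{n \ge 0}$ is a martingale and $T$ is a stopping time, then

$$\mathbb{E}[X_T] = \mathbb{E}[X_0]$$

under any of the usual sufficient conditions, e.g. (i) $T$ is bounded; (ii) $T < \infty$ a.s. and $(X_n)$ is uniformly bounded; or (iii) $\mathbb{E}[T] < \infty$ and the increments $|X_{n+1} - X_n|$ are uniformly bounded. The asymmetric random walk example typically applies this to the exponential martingale $(q/p)^{S_n}$, where $S_n$ is a walk stepping $+1$ with probability $p \ne 1/2$ and $-1$ with probability $q = 1 - p$.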
4.5. Square integrable martingales
In this section, we look into martingales with a special property: square integrability. Square integrability gives a martingale an upper bound on the expectation of its running maximum, which can in turn be used to determine the convergence of the sequence.
» continue reading
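The bound alluded to above is presumably Doob's $L^2$ maximal inequality (stated here in its standard form, which may differ from the section's): for a martingale $(X_n)$ with $\mathbb{E}[X_n^2] < \infty$ for each $n$,

$$\mathbb{E}\!\left[\max_{0 \le m \le n} X_m^2\right] \le 4\,\mathbb{E}[X_n^2],$$

so if $\sup_n \mathbb{E}[X_n^2] < \infty$, the martingale converges almost surely and in $L^2$.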