A partially observable Markov decision process (POMDP) can be informally defined as a world in which an agent can take actions and receive rewards. The world has a set of possible world states S. This set can be finite (e.g. Minesweeper) or infinite (e.g. a robotic car in a parking lot). The world is only partially observable, so if we try to program an agent to act in this world, the agent does not know the state of the entire world; rather, it only gets observations that partially reveal that state. The set of possible observations is typically called Ω. The agent in a POMDP can take actions, drawn from a set usually called A. The actions affect the world and result in rewards, which also depend on the state of the world. (Technically, for every world state s and action a there is a real-valued reward R(s, a). Also, for every world state s and action a there is a probability distribution over the new world states that can result from taking action a when the world is in state s, and each observation is likewise drawn from a distribution that depends on the resulting state and the action taken.)
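To make these pieces concrete, here is a minimal sketch in Python of a tiny discrete POMDP, using the classic tiger problem as the example. The tiger problem itself and all the names in the code (S, A, Omega, R, T, O, step) are illustrative choices, not part of the definition above; this is just one way to write down a finite state set, action set, observation set, reward function, and transition and observation distributions.

```python
import random

# A minimal sketch of a discrete POMDP: the classic "tiger" problem.
# R(s, a) is a real-valued reward, T(s, a) is a distribution over next
# states, and O(s', a) is a distribution over observations.

S = ["tiger-left", "tiger-right"]          # hidden world states
A = ["listen", "open-left", "open-right"]  # actions the agent can take
Omega = ["hear-left", "hear-right"]        # partial observations

def R(s, a):
    """Reward R(s, a): a real number for each state-action pair."""
    if a == "listen":
        return -1.0                         # small cost for listening
    opened = "tiger-left" if a == "open-left" else "tiger-right"
    return -100.0 if s == opened else 10.0  # tiger vs. treasure

def T(s, a):
    """Distribution over next states after taking action a in state s."""
    if a == "listen":
        return {s: 1.0}                     # listening doesn't move the tiger
    return {s2: 0.5 for s2 in S}            # opening a door resets the problem

def O(s_next, a):
    """Observation distribution: a noisy hint about the hidden state."""
    if a == "listen":
        correct = "hear-left" if s_next == "tiger-left" else "hear-right"
        return {o: (0.85 if o == correct else 0.15) for o in Omega}
    return {o: 0.5 for o in Omega}          # doors reveal nothing useful

def sample(dist):
    """Draw one outcome from a {value: probability} distribution."""
    r, acc = random.random(), 0.0
    for value, p in dist.items():
        acc += p
        if r <= acc:
            return value
    return value

def step(s, a):
    """One POMDP step: the agent sees (observation, reward), never s itself."""
    reward = R(s, a)
    s_next = sample(T(s, a))
    obs = sample(O(s_next, a))
    return s_next, obs, reward
```

An agent interacting with `step` only ever sees the stream of observations and rewards, never the hidden state. To act well it has to maintain a belief, i.e. a probability distribution over S, updated by Bayes' rule after each action and observation.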