{"id":16665,"date":"2020-01-19T23:25:57","date_gmt":"2020-01-20T04:25:57","guid":{"rendered":"https:\/\/bangla.salearningschool.com\/recent-posts\/?p=16665"},"modified":"2020-02-08T09:41:54","modified_gmt":"2020-02-08T14:41:54","slug":"bayesian-statistics-and-machine-learning","status":"publish","type":"post","link":"http:\/\/bangla.sitestree.com\/?p=16665","title":{"rendered":"Bayesian Statistics and Machine Learning"},"content":{"rendered":"<p><strong>Bayesian Statistics and Machine Learning<br \/>\n<\/strong><br \/>\n<strong>&#8220;Bayesian inference<\/strong> is a method of statistical <strong>inference<\/strong> in which <strong>Bayes<\/strong>&#8216; theorem is used to update the probability for a hypothesis as more evidence or information becomes available. <strong>Bayesian inference<\/strong> is an important technique in statistics, and especially in mathematical statistics.&#8221;<\/p>\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Bayesian_inference\">en.wikipedia.org \u203a wiki \u203a Bayesian_inference<br \/>\n<\/a><\/p>\n<h3><a href=\"https:\/\/en.wikipedia.org\/wiki\/Bayesian_inference\">Bayesian inference &#8211; Wikipedia<\/a><\/h3>\n<p>&#8220;Firstly, (statistical) inference is the process of deducing properties about a population or probability distribution from data&#8221;<\/p>\n<p>&#8220;Bayesian inference is therefore just the process of deducing properties about a population or probability distribution from data <em>using Bayes\u2019 theorem.<\/em> That\u2019s it.&#8221;<br \/>\n<a href=\"https:\/\/towardsdatascience.com\/probability-concepts-explained-bayesian-inference-for-parameter-estimation-90e8930e5348\">https:\/\/towardsdatascience.com\/probability-concepts-explained-bayesian-inference-for-parameter-estimation-90e8930e5348<\/a><\/p>\n<h1>Introduction to Bayesian Inference<\/h1>\n<p><a 
href=\"https:\/\/blogs.oracle.com\/datascience\/introduction-to-bayesian-inference\">https:\/\/blogs.oracle.com\/datascience\/introduction-to-bayesian-inference<\/a><\/p>\n<h1>Bayesian Linear Regression<\/h1>\n<p>&#8220;In the Bayesian viewpoint, we formulate linear regression using probability distributions rather than point estimates. The response, y, is not estimated as a single value, but is assumed to be drawn from a probability distribution. The model for Bayesian Linear Regression with the response sampled from a normal distribution is:&#8221;<\/p>\n\n\t\t<style type=\"text\/css\">\n\t\t\t#gallery-1 {\n\t\t\t\tmargin: auto;\n\t\t\t}\n\t\t\t#gallery-1 .gallery-item {\n\t\t\t\tfloat: left;\n\t\t\t\tmargin-top: 10px;\n\t\t\t\ttext-align: center;\n\t\t\t\twidth: 33%;\n\t\t\t}\n\t\t\t#gallery-1 img {\n\t\t\t\tborder: 2px solid #cfcfcf;\n\t\t\t}\n\t\t\t#gallery-1 .gallery-caption {\n\t\t\t\tmargin-left: 0;\n\t\t\t}\n\t\t\t\/* see gallery_shortcode() in wp-includes\/media.php *\/\n\t\t<\/style>\n\t\t<div id='gallery-1' class='gallery galleryid-16665 gallery-columns-3 gallery-size-thumbnail'><dl class='gallery-item'>\n\t\t\t<dt class='gallery-icon landscape'>\n\t\t\t\t<a href='http:\/\/bangla.sitestree.com\/?attachment_id=16666'><img loading=\"lazy\" decoding=\"async\" width=\"150\" height=\"111\" src=\"https:\/\/i0.wp.com\/bangla.sitestree.com\/wp-content\/uploads\/2020\/01\/image-1.png?resize=150%2C111\" class=\"attachment-thumbnail size-thumbnail\" alt=\"\" srcset=\"https:\/\/i0.wp.com\/bangla.sitestree.com\/wp-content\/uploads\/2020\/01\/image-1.png?resize=150%2C111 150w, https:\/\/i0.wp.com\/bangla.sitestree.com\/wp-content\/uploads\/2020\/01\/image-1.png?zoom=2&amp;resize=150%2C111 300w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a>\n\t\t\t<\/dt><\/dl><dl class='gallery-item'>\n\t\t\t<dt class='gallery-icon landscape'>\n\t\t\t\t<a href='http:\/\/bangla.sitestree.com\/?attachment_id=16667'><img loading=\"lazy\" decoding=\"async\"
width=\"150\" height=\"102\" src=\"https:\/\/i0.wp.com\/bangla.sitestree.com\/wp-content\/uploads\/2020\/01\/image-2.png?resize=150%2C102\" class=\"attachment-thumbnail size-thumbnail\" alt=\"\" srcset=\"https:\/\/i0.wp.com\/bangla.sitestree.com\/wp-content\/uploads\/2020\/01\/image-2.png?resize=150%2C102 150w, https:\/\/i0.wp.com\/bangla.sitestree.com\/wp-content\/uploads\/2020\/01\/image-2.png?zoom=2&amp;resize=150%2C102 300w\" sizes=\"auto, (max-width: 150px) 100vw, 150px\" \/><\/a>\n\t\t\t<\/dt><\/dl>\n\t\t\t<br style='clear: both' \/>\n\t\t<\/div>\n\n<p><a href=\"https:\/\/towardsdatascience.com\/introduction-to-bayesian-linear-regression-e66e60791ea7\">https:\/\/towardsdatascience.com\/introduction-to-bayesian-linear-regression-e66e60791ea7<\/a><\/p>\n<h1>&#8220;Bayesian model selection<\/h1>\n<pre><a href=\"http:\/\/alumni.media.mit.edu\/~tpminka\/\">Tom Minka<\/a><\/pre>\n<p><img data-recalc-dims=\"1\" decoding=\"async\" src=\"https:\/\/i0.wp.com\/alumni.media.mit.edu\/~tpminka\/statlearn\/demo\/occam.gif?w=750\" alt=\"occam.gif\" \/>Bayesian model selection uses the rules of probability theory to select among different hypotheses. 
It is completely analogous to Bayesian classification.&#8221;<\/p>\n<p><a href=\"http:\/\/alumni.media.mit.edu\/~tpminka\/statlearn\/demo\/\">http:\/\/alumni.media.mit.edu\/~tpminka\/statlearn\/demo\/<\/a><\/p>\n<p><a href=\"https:\/\/www.countbayesie.com\/blog\/2019\/6\/12\/logistic-regression-from-bayes-theorem\">Logistic Regression from Bayes&#8217; Theorem<\/a><\/p>\n<p><a href=\"https:\/\/www.countbayesie.com\/blog\/2019\/6\/12\/logistic-regression-from-bayes-theorem\">https:\/\/www.countbayesie.com\/blog\/2019\/6\/12\/logistic-regression-from-bayes-theorem<\/a><\/p>\n<p>Kernel Trick and Kernels<br \/>\n<a href=\"https:\/\/svivek.com\/teaching\/lectures\/slides\/svm\/kernels.pdf\">https:\/\/svivek.com\/teaching\/lectures\/slides\/svm\/kernels.pdf<\/a><\/p>\n<p>&#8220;When talking about kernels in machine learning, most likely the first thing that comes into your mind is the support vector machines (SVM)&#8221;<\/p>\n<p><a href=\"https:\/\/medium.com\/@zxr.nju\/what-is-the-kernel-trick-why-is-it-important-98a98db0961d\">https:\/\/medium.com\/@zxr.nju\/what-is-the-kernel-trick-why-is-it-important-98a98db0961d<\/a><\/p>\n<p>Gaussian Processes<\/p>\n<p>&#8220;or why I don\u2019t use SVMs&#8221;<\/p>\n<p><a href=\"https:\/\/mlss2011.comp.nus.edu.sg\/uploads\/Site\/lect1gp.pdf\">https:\/\/mlss2011.comp.nus.edu.sg\/uploads\/Site\/lect1gp.pdf<\/a><\/p>\n<p>Gaussian Process Classification and Active Learning with Multiple Annotators<\/p>\n<p><a href=\"http:\/\/proceedings.mlr.press\/v32\/rodrigues14.pdf\">http:\/\/proceedings.mlr.press\/v32\/rodrigues14.pdf<\/a><\/p>\n<p><strong>&#8220;Assumed density filtering<\/strong> is an online inference algorithm that incrementally updates the posterior over W after observing new evidence.&#8221;<\/p>\n<p><a href=\"https:\/\/www.aaai.org\/ocs\/index.php\/AAAI\/AAAI16\/paper\/download\/12391\/11777\">https:\/\/www.aaai.org\/ocs\/index.php\/AAAI\/AAAI16\/paper\/download\/12391\/11777<\/a><\/p>\n<p>&#8220;<strong>Expectation propagation<\/strong> (EP) is a technique in Bayesian machine learning. EP finds approximations to a probability distribution. It uses an iterative approach that leverages the factorization structure of the target distribution.&#8221;<\/p>\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Expectation_propagation\">en.wikipedia.org \u203a wiki \u203a Expectation_propagation<\/a><\/p>\n<h3>Expectation propagation &#8211; Wikipedia<\/h3>\n<p><strong>rejection sampling<\/strong><\/p>\n<p>&#8220;In numerical analysis and <a title=\"Computational statistics\" href=\"https:\/\/en.wikipedia.org\/wiki\/Computational_statistics\">computational statistics<\/a>, <strong>rejection sampling<\/strong> is a basic technique used to generate observations from a <a title=\"Probability distribution\" href=\"https:\/\/en.wikipedia.org\/wiki\/Probability_distribution\">distribution<\/a>. It is also commonly called the <strong>acceptance-rejection method<\/strong> or &#8220;accept-reject algorithm&#8221; and is a type of exact simulation method.
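<\/p>\n<p><em>A minimal illustrative sketch (an addition, not from the quoted sources): rejection sampling from an unnormalized target density using a uniform proposal. The names target_density and rejection_sample, the Beta(2, 2)-like target on [0, 1], and the bound m are assumptions chosen for this example.<\/em><\/p>

```python
import random

def target_density(x):
    # Unnormalized Beta(2, 2) density on [0, 1]: f(x) proportional to x * (1 - x)
    return x * (1 - x)

def rejection_sample(n_samples, m=0.25):
    # m must satisfy target_density(x) <= m for all x in [0, 1];
    # the max of x * (1 - x) is 0.25, attained at x = 0.5
    samples = []
    while len(samples) < n_samples:
        x = random.random()    # candidate from the Uniform(0, 1) proposal
        u = random.random()    # uniform acceptance variable
        if u * m <= target_density(x):
            samples.append(x)  # accept the candidate as a draw from the target
    return samples

random.seed(0)
draws = rejection_sample(1000)
print(len(draws))  # prints 1000
```

<p>Every accepted draw is an exact sample from the normalized target; candidates landing above the density are simply discarded, which is why a tight bound m keeps the acceptance rate high.<\/p>\n<p>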
The method works for any distribution in <img decoding=\"async\" src=\"https:\/\/wikimedia.org\/api\/rest_v1\/media\/math\/render\/svg\/6a87a024931038d1858dc22e8a194e5978c3412e\" alt=\"\\mathbb {R} ^{m}\" \/> with a <a title=\"Probability density function\" href=\"https:\/\/en.wikipedia.org\/wiki\/Probability_density_function\">density<\/a>.&#8221;<\/p>\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Rejection_sampling\">https:\/\/en.wikipedia.org\/wiki\/Rejection_sampling<\/a><\/p>\n<p>Bayesian Deep Learning<\/p>\n<p>&#8220;However, graphical networks such as <strong>Bayesian<\/strong> networks are useful to model uncertainty along with causal inference and logic deduction. In this regard, <strong>Bayesian Deep learning<\/strong> combines perception (<strong>deep learning<\/strong>) with strong probabilistic inference which can estimate uncertainty.&#8221;<\/p>\n<p><a href=\"https:\/\/www.quora.com\/How-different-is-Bayesian-deep-learning-from-deep-learning\">www.quora.com \u203a How-different-is-Bayesian-deep-learning-from-deep-&#8230;<\/a><\/p>\n<h3><a href=\"https:\/\/www.quora.com\/How-different-is-Bayesian-deep-learning-from-deep-learning\">How different is Bayesian deep learning from deep learning? &#8211; Quora<\/a><\/h3>\n<p>*** . . *** . ***<br \/>\n<em><strong>Note: Older short-notes from this site are posted on Medium: <\/strong><\/em><a href=\"https:\/\/medium.com\/@SayedAhmedCanada\">https:\/\/medium.com\/@SayedAhmedCanada<\/a><\/p>\n<p>*** . *** *** . *** . *** . ***<\/p>\n<p><em><strong>Sayed Ahmed<\/strong><br \/>\n<\/em><br \/>\n<em><strong>BSc. Eng. in Comp. Sc. &amp; Eng. (BUET)<\/strong><\/em><br \/>\n<em><strong>MSc. in Comp. Sc. (U of Manitoba, Canada)<\/strong><\/em><br \/>\n<em><strong>MSc.
in Data Science and Analytics (Ryerson University, Canada)<\/strong><\/em><br \/>\n<em><strong>Linkedin<\/strong>: <a href=\"https:\/\/ca.linkedin.com\/in\/sayedjustetc\">https:\/\/ca.linkedin.com\/in\/sayedjustetc<\/a><br \/>\n<\/em><\/p>\n<p><em><strong>Blog<\/strong>: <a href=\"http:\/\/bangla.salearningschool.com\/\">http:\/\/Bangla.SaLearningSchool.com<\/a>, <a href=\"http:\/\/sitestree.com\">http:\/\/SitesTree.com<\/a><\/em><br \/>\n<em><strong>Online and Offline Training<\/strong>: <a href=\"http:\/\/training.SitesTree.com\">http:\/\/Training.SitesTree.com<\/a> (Also, can be free and low cost sometimes)<\/em><\/p>\n<p><em>Facebook Group\/Form to discuss (Q &amp; A): <\/em><a href=\"https:\/\/www.facebook.com\/banglasalearningschool\">https:\/\/www.facebook.com\/banglasalearningschool<\/a><\/p>\n<p>Our free or paid training events: <a href=\"https:\/\/www.facebook.com\/justetcsocial\">https:\/\/www.facebook.com\/justetcsocial<\/a><\/p>\n<p><em>Get access to courses on Big Data, Data Science, AI, Cloud, Linux, System Admin, Web Development and Misc. related. Also, create your own course to sell to others. 
<\/em><a href=\"http:\/\/sitestree.com\/training\/\">http:\/\/sitestree.com\/training\/<\/a><\/p>\n<p><em><strong>I<\/strong>f you want to contribute to occasional free and\/or low cost online\/offline training or charitable\/non-profit work in the education\/health\/social service sector, you can financially contribute to: safoundation at <a href=\"http:\/\/salearningschool.com\">salearningschool.com<\/a> using Paypal or Credit Card (on <\/em><a href=\"http:\/\/sitestree.com\/training\/enrol\/index.php?id=114\">http:\/\/sitestree.com\/training\/enrol\/index.php?id=114<\/a> <em>).<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Bayesian Statistics and Machine Learning &#8220;Bayesian inference is a method of statistical inference in which Bayes&#8216; theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics.&#8221; en.wikipedia.org \u203a wiki \u203a Bayesian_inference Bayesian inference &#8211; Wikipedia &hellip; <\/p>\n<p><a class=\"more-link btn\" href=\"http:\/\/bangla.sitestree.com\/?p=16665\">Continue reading<\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1910,1908,182],"tags":[],"class_list":["post-16665","post","type-post","status-publish","format-standard","hentry","category-ai-ml-ds-rl-dl-nn-nlp-data-mining-optimization","category-math-and-statistics-for-data-science-and-engineering","category---blog","item-wrap"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack-related-posts":[{"id":22629,"url":"http:\/\/bangla.sitestree.com\/?p=22629","url_meta":{"origin":16665,"position":0},"title":"Resources for Data Science Based Project Development","author":"Sayed","date":"March 
16, 2021","format":false,"excerpt":"Projects mentioned on: http:\/\/sitestree.com\/prediction-bayesian-regression-concepts-example-projects\/ Insurance HealthCare Costs: https:\/\/github.com\/techshot25\/HealthCare Linear and Bayesian modeling in R: Predicting movie popularity https:\/\/towardsdatascience.com\/linear-and-bayesian-modelling-in-r-predicting-movie-popularity-6c8ef0a44184 Bayesian-Stock-Price-Prediction https:\/\/github.com\/lschlessinger1\/Bayesian-Stock-Price-Prediction Bayesian Prediction: Well (Oil) Production https:\/\/github.com\/jpgrana\/bayesian-approach-predicting-well-production Binary Classification on Stock Market (S&P 500) using Naive Bayes and Logistic Regression https:\/\/github.com\/NeilPrabhu\/Stock-Prediction Naive Bayes Weather Prediction https:\/\/github.com\/husnainfareed\/simple-naive-bayes-weather-prediction\/blob\/master\/bayes.py Regression Predict Fuel Efficiency: https:\/\/www.tensorflow.org\/tutorials\/keras\/basic_regression\u2026","rel":"","context":"In &quot;\u09ac\u09cd\u09b2\u0997 \u0964 Blog&quot;","block_context":{"text":"\u09ac\u09cd\u09b2\u0997 \u0964 Blog","link":"http:\/\/bangla.sitestree.com\/?cat=182"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":20577,"url":"http:\/\/bangla.sitestree.com\/?p=20577","url_meta":{"origin":16665,"position":1},"title":"Machine Learning (and related) Resources as may have consulted in the past","author":"Sayed","date":"February 23, 2021","format":false,"excerpt":"References Projects mentioned on: http:\/\/sitestree.com\/prediction-bayesian-regression-concepts-example-projects\/ Insurance HealthCare Costs: https:\/\/github.com\/techshot25\/HealthCare Linear and Bayesian modeling in R: Predicting movie popularity https:\/\/towardsdatascience.com\/linear-and-bayesian-modelling-in-r-predicting-movie-popularity-6c8ef0a44184 Bayesian-Stock-Price-Prediction https:\/\/github.com\/lschlessinger1\/Bayesian-Stock-Price-Prediction Bayesian Prediction: Well (Oil) 
Production https:\/\/github.com\/jpgrana\/bayesian-approach-predicting-well-production Binary Classification on Stock Market (S&P 500) using Naive Bayes and Logistic Regression https:\/\/github.com\/NeilPrabhu\/Stock-Prediction Naive Bayes Weather Prediction https:\/\/github.com\/husnainfareed\/simple-naive-bayes-weather-prediction\/blob\/master\/bayes.py Regression Predict Fuel Efficiency:\u2026","rel":"","context":"In &quot;\u09ac\u09cd\u09b2\u0997 \u0964 Blog&quot;","block_context":{"text":"\u09ac\u09cd\u09b2\u0997 \u0964 Blog","link":"http:\/\/bangla.sitestree.com\/?cat=182"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":14889,"url":"http:\/\/bangla.sitestree.com\/?p=14889","url_meta":{"origin":16665,"position":2},"title":"Prediction: Bayesian: Regression: Concepts\/Example Projects","author":"Sayed","date":"July 7, 2019","format":false,"excerpt":"Insurance HealthCare Costs: https:\/\/github.com\/techshot25\/HealthCare Linear and Bayesian modeling in R: Predicting movie popularity https:\/\/towardsdatascience.com\/linear-and-bayesian-modelling-in-r-predicting-movie-popularity-6c8ef0a44184 Bayesian-Stock-Price-Prediction https:\/\/github.com\/lschlessinger1\/Bayesian-Stock-Price-Prediction Bayesian Prediction: Well (Oil) Production https:\/\/github.com\/jpgrana\/bayesian-approach-predicting-well-production Binary Classification on Stock Market (S&P 500) using Naive Bayes and Logistic Regression https:\/\/github.com\/NeilPrabhu\/Stock-Prediction Naive Bayes Weather Prediction https:\/\/github.com\/husnainfareed\/simple-naive-bayes-weather-prediction\/blob\/master\/bayes.py Regression Predict Fuel Efficiency: https:\/\/www.tensorflow.org\/tutorials\/keras\/basic_regression Regression-Example-Predicting-House-Prices https:\/\/github.com\/andersy005\/deep-learning\/blob\/master\/keras\/04-A-Regression-Example-Predicting-House-Prices.ipynb Stock Price\u2026","rel":"","context":"In &quot;AI ML DS RL DL NN NLP Data Mining 
Optimization&quot;","block_context":{"text":"AI ML DS RL DL NN NLP Data Mining Optimization","link":"http:\/\/bangla.sitestree.com\/?cat=1910"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":16619,"url":"http:\/\/bangla.sitestree.com\/?p=16619","url_meta":{"origin":16665,"position":3},"title":"Math\/Stat\/CS\/DS Topics that you need to know (with Cognitive, Psychomotor, Affective domain skills) to become a true and great Data Scientist","author":"Sayed","date":"January 7, 2020","format":false,"excerpt":"\"The core topics are cross-validation, shrinkage methods (ridge regression, the LASSO, etc.), neural networks, gradient boosting, separating hyperplanes, support vector machines, basis expansion and regularization (e.g., smoothing splines, wavelet smoothing, kernel smoothing), generalized additive models, bump hunting, multivariate adaptive regression splines (MARS), self-organizing maps, mixture model-based clustering, ensemble learning, and\u2026","rel":"","context":"In &quot;Math and Statistics for Data Science, and Engineering&quot;","block_context":{"text":"Math and Statistics for Data Science, and Engineering","link":"http:\/\/bangla.sitestree.com\/?cat=1908"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":78217,"url":"http:\/\/bangla.sitestree.com\/?p=78217","url_meta":{"origin":16665,"position":4},"title":"Model Selection","author":"Sayed","date":"May 21, 2025","format":false,"excerpt":"\u2022 Optimizations\/Machine Learning\/Data Mining\/Deep Learning\/Reinforcement Learning\/Graph Mining\/NLP\/Genetic Algorithms \u2022 Regression \u2022 Linear \u2022 Non-Linear \u2022 Classifications \u2022 Logistics Regression \u2022 Sigmoid : Binary \u2022 Softmax: Multi-Class \u2022 Bayes Classifier \u2022 SVM \u2022 Bayesian: Regression\/Classification \u2022 Clustering \u2022 K-NN \u2022 KNN+ \u2022 Kmeans, Hierarchical, Density \u2022Machine Learning\/Data Mining\/Deep Learning\/Reinforcement Learning\/Graph 
Mining\/NLP\u2026","rel":"","context":"In &quot;Analytics and Machine Learning Project Development&quot;","block_context":{"text":"Analytics and Machine Learning Project Development","link":"http:\/\/bangla.sitestree.com\/?cat=1974"},"img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":16736,"url":"http:\/\/bangla.sitestree.com\/?p=16736","url_meta":{"origin":16665,"position":5},"title":"Misc Basic Statistics for Data Science","author":"Sayed","date":"February 2, 2020","format":false,"excerpt":"Hypergeometric Distribution \"In probability theory and statistics, the hypergeometric distribution is a discrete probability distribution that describes the probability of successes (random draws for which the object drawn has a specified feature) in draws, without replacement, from a finite population of size that contains exactly objects with that feature, wherein\u2026","rel":"","context":"In &quot;AI ML DS RL DL NN NLP Data Mining Optimization&quot;","block_context":{"text":"AI ML DS RL DL NN NLP Data Mining Optimization","link":"http:\/\/bangla.sitestree.com\/?cat=1910"},"img":{"alt_text":"","src":"https:\/\/i0.wp.com\/bangla.salearningschool.com\/wp-content\/uploads\/2020\/02\/image.png?resize=350%2C200","width":350,"height":200,"srcset":"https:\/\/i0.wp.com\/bangla.salearningschool.com\/wp-content\/uploads\/2020\/02\/image.png?resize=350%2C200 1x, https:\/\/i0.wp.com\/bangla.salearningschool.com\/wp-content\/uploads\/2020\/02\/image.png?resize=525%2C300 
1.5x"},"classes":[]}],"_links":{"self":[{"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=\/wp\/v2\/posts\/16665","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=16665"}],"version-history":[{"count":4,"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=\/wp\/v2\/posts\/16665\/revisions"}],"predecessor-version":[{"id":16731,"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=\/wp\/v2\/posts\/16665\/revisions\/16731"}],"wp:attachment":[{"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=16665"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=16665"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/bangla.sitestree.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=16665"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}