Two years into the Moon Jae-in administration: the nuclear phase-out, minimum-wage hikes, the 52-hour workweek, converting non-regular workers to regular status, peace on the Korean Peninsula. It all sounded plausible, but the results have been disastrous. The industrial base has been wrecked, employment has collapsed, self-employed businesses and small and mid-sized firms are closing and going bankrupt, and the middle class is falling apart under the pain of rising living costs. By erecting entry barriers to new hiring in order to protect the vested interests of entrenched labor, and by eliminating competition in the labor market, the government has driven productivity down and corporate competitiveness into decline; as a result, the Korean economy is plummeting in global competition. With the principles by which our society accumulates wealth and the path to prosperity shaken at the roots, the whole nation is losing its "will to engage in economic life." North Korean denuclearization looks unlikely, and diplomacy with the U.S., Japan, China, and our other neighbors is in ruins. At this point it is clear that the major policies the Moon administration touts have failed.
And yet many people say the Moon administration "started with good intentions, but the results have been disastrous, so it should now change course." Since the results are disastrous, changing course is obviously necessary. But my question is whether those "good intentions" were ever real. What if these disastrous results were in fact "intended"? Wouldn't that be a "frightening intention"? At first I too assumed, perhaps naively, that these were policies born of good intentions, of a sense of justice toward the weak. But about a year in, I began to wonder: when the policies were producing none of their intended effects and the harms were overwhelming, why did the government refuse to listen, reflect, or consider any correction at all? What if the intention was not "good" but "frightening"?...
That is why I believe the Moon administration does not have good intentions: it knows full well that on this course the Korean economy will collapse. Surely the government has economists and career officials; could they really be ignorant of something this basic? Isn't this exactly how an economy stagnates: as the socialist principles of communal production, communal ownership, and communal distribution seep into the national character, the will to work hard at economic life is lost, no one works or invests diligently any longer, and the society declines? That phenomenon is now unfolding across the country, and surely they already know how socialism turned out everywhere else in the world.
The important point is this: even if the country ends up that way, those in power lose nothing. Since production, ownership, and distribution come to be monopolized, operated, and managed by the powerful, it becomes their world. Isn't the Moon administration's real intention to condition the people to socialism in just this way, to monopolize wealth through power, to run and manage it, to enjoy the spoils and the posts, and to hold power indefinitely?
There will be some discontent for a while, but once those who can leave have left and the middle class has collapsed, everyone will end up lining up for the rations and handouts that power dispenses. Why can't the people of Venezuela properly resist Maduro's rule even as the country falls apart? Because they have all lost their self-reliance and independence and become slaves of power. Do you still believe in the Moon administration's good intentions? What if, as I say, the intention is a "frightening" one? (excerpt)
--->Once you fall into an ideology, you interpret everything around you only through that ideology. The left acted less out of good intentions than out of a rose-tinted fantasy: they believed that if their leftist policies were realized, a truly livable and peaceful world would arrive. Reality, of course, refutes this, but their ideology turns its back on that reality; they retreated into their own cave and dreamed the dream the fantasy offered. Today's economic ruin is simply the result of that fantasy.
----------------------------------------------------------------
Kim Moon-soo
Why is it that President Moon Jae-in strikes me first not as the commander-in-chief of the armed forces (Article 74 of the Constitution) but as their disarmer? What frightens me is the reality that the military is not growing stronger under its commander-in-chief; its strength is only being degraded.
The more we desire peace, the more we must prepare for war, must we not? Isn't that the meaning of "preparedness averts calamity"? For Commander-in-Chief Moon Jae-in, racing down the road of unpreparedness, only "our nation by itself" remains; the main enemy and the ROK-US alliance have both vanished.
He seems unable to see Kim Jong-un's nuclear weapons or missiles. Having seen the photographs of Kim Jong-un observing the launch of Iskander missiles, which neither THAAD nor Patriot can intercept, and with the commander-in-chief still playing Kim Jong-un's defense attorney, how can we protect the Republic of Korea without impeaching him?
---------------------------------------------------------------
Her nationality was reportedly written as "Democratic People's Republic of Korea," as disclosed on a Garosero Institute live broadcast. The document in question is an official U.S. home-purchase record that anyone can freely inspect, and on it, when buying the house, she listed her nationality as North Korea.
[Source] Garosero Institute: Sohn Hye-won listed her nationality as DPRK when buying a house in New York
https://youtu.be/GCMt_BD6yxg (at the 25-minute mark)
--------------------------------------------------------------------------------------
Xi Jinping has no intention of concluding a trade agreement.
--------------------------------------------------------
From the latest issue of the Study Times (学习时报):
We must fully recognize the long-term, complex, and arduous nature of the great struggle; prepare for struggle in every domain, including the economic, political, cultural, social, diplomatic, and military; and be thoroughly prepared for contradictions and challenges of any form.
------------------------------------------------------------------
Xi Jinping rejected the concessions his negotiators had proposed, and said he would personally bear responsibility for the consequences.
--------------------------------------------------------------
Generational divides are growing in American politics, and they may become a more important factor than the racial and class gaps that have traditionally been the focus of political analysis.
If the Democratic Party can consolidate Generation Z and the Millennials into a single political force, it may be able to take power in the coming elections.
----------------------------------------------------------------------------------
The BBC claimed, citing scientific evidence, that modern agriculture is bad for wildlife,
but many studies suggest the opposite is true.
-------------------------------------------------------------------------------
I am writing this review in response to some confusion and unfairness I see in other reviews. Cover and Thomas have written a unique and ambitious introduction to a fascinating and complex subject; their book must be judged fairly and not compared to other books that have entirely different goals.
Claude Shannon provided a working definition of "information" in his seminal 1948 paper, A Mathematical Theory of Communication. Shannon's interest in that and subsequent papers was the attainment of reliable communication in noisy channels. The definition of information that Shannon gave was perfectly fitted to this task; indeed, it is easily shown that in the context studied by Shannon, the only meaningful measure of information content that will apply to random variables with known distribution must be (up to a multiplicative constant) of the now-familiar form h(p) = log(1/p).
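Shannon's measure can be illustrated in a few lines of Python (a minimal sketch for this review, not code from the book): self-information h(p) = log2(1/p) in bits, and entropy as the expected self-information of a distribution. The additivity check at the end shows the key property behind the logarithmic form: independent events multiply probabilities, so their information contents add.

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information in bits: h(p) = log2(1/p)."""
    return math.log2(1.0 / p)

def entropy(dist) -> float:
    """Shannon entropy: the expected self-information of a distribution."""
    return sum(p * self_information(p) for p in dist if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(entropy([0.5, 0.5]))  # → 1.0

# Additivity: two independent events with probabilities 0.5 and 0.25
# have joint probability 0.125, and h(0.125) = h(0.5) + h(0.25) = 3 bits.
print(self_information(0.125))  # → 3.0
```

Up to the multiplicative constant mentioned above (the choice of logarithm base, i.e., the unit of information), this is the only measure with that additivity property.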
However, Shannon freely admitted that his definition of information was limited in scope and was never envisioned as being universal. Shannon deliberately avoided the "murkier" aspects of human communication in framing his definitions; problematic themes such as knowledge, semantics, motivations and intentions of the sender and/or receiver, etc., were avoided altogether.
For several decades, Information Theory continued to exist as a subset of the theory of reliable communication. Some classical and highly regarded texts on the subject are Gallager, Ash, Viterbi and Omura, and McEliece. For those whose interest in Information Theory is motivated largely by questions from the field of digital communications, these texts remain unrivalled standards; Gallager, in particular, is so highly regarded by those who learned from it that it is still described as superior to many of its more recent, up-to-date successors.
In recent decades, Information Theory has been applied to problems from across a wide array of academic disciplines. Physicists have been forced to clarify the extent to which information is conserved in order to completely understand black hole dynamics; biologists have found extensive use of Information Theoretic concepts in understanding the human genome; computer scientists have applied Information Theory to complex issues in computational vs. descriptive complexity (the Mandelbrot set, which has been called the most complex set in all of mathematics, is actually extremely simple from the point of view of Kolmogorov complexity); and John von Neumann's brilliant creation, game theory, which has been called "a universal language for the unification of the behavioral sciences," is intimately coupled to Information Theory, perhaps in ways that have not yet been fully appreciated or explored.
Cover and Thomas' book "Elements of Information Theory" is written for the reader who is interested in these eclectic and exciting applications of Information Theory. This book does NOT treat Information Theory as a subset of reliable communication theory; therefore, the book is NOT written as a competitor for Gallager's classic text. Critics who ask
for a more thorough treatment of rate distortion theory or convolutional codes are criticizing the authors for failing to include topics that are not even central to their goals for the text!
A very selective list of some of the more interesting topics that Cover and Thomas study includes: (1) the Asymptotic Equipartition Property and its consequences for data compression; (2) Information Theory and gambling; (3) Kolmogorov complexity and Chaitin's Omega; (4) Information Theory and statistics; and (5) Information Theory and the stock market. Item (4) on this list is only briefly introduced in Cover and Thomas's book, and appropriately so; however, readers who wish to pursue the fascinating subject of Fisher Information further should consider B. Roy Frieden's book Physics from Fisher Information: A Unification. Frieden identifies a principle of "extreme physical information" as a unifying theme across all of physics, deriving such classic equations as the Klein-Gordon equation, Maxwell's equations, and Einstein's field equations for general relativity from this information-theoretic principle.
This last point is quite typical of Cover and Thomas's book. I participated in a faculty seminar on Information Theory at my university a few years ago, in which we studied Cover and Thomas as our primary source. We were a diverse group, drawn from five different academic disciplines, and we all found that Cover and Thomas repeatedly introduced us to exciting and unexpected applications of Information Theory, always sending us to the journals for further, more in-depth study.
Cover and Thomas' book has become an established favorite in university courses on information theory. In truth, the book has few competitors. Interested readers looking for additional references might also consider David MacKay's book Information Theory, Inference, and Learning Algorithms, which has as a primary goal the use of information theory in the study of Neural Networks and learning algorithms. George Klir's book Uncertainty and Information considers many alternative measures of information/uncertainty, moving far beyond the classical log(1/p) measure of Shannon and the context in which it arose. Jan Kahre's iconoclastic book The Mathematical Theory of Information is an intriguing alternative in which the so-called Law of Diminishing Information is elevated to primary axiomatic status in deriving measures of information content. I alluded to some of the "murkier" issues of human communication earlier; readers who wish to study some of those issues will find Yehoshua Bar-Hillel's book Language and Information a useful source.
In conclusion, I highly recommend Cover and Thomas' book on Information Theory. It is currently unrivalled as a rigorous introduction to applications of Information Theory across the curriculum. As a person who used to work in the general area of signals analysis, I resist all comparisons of Cover and Thomas' book with the classic text of Gallager; the books have vastly different goals and very little overlap. (an Amazon reader review)
-----------------------------------------------------------------------------------------------------------------------------------
Acknowledging that many things not only elude our understanding but possess a logic cleverer than our own takes great wisdom and self-restraint.
--->I too am astonished whenever I discover that animals, insects, and even plants have their own wisdom for living.
------------------------------------------------------------------------------------------------------------------------------------
[Source] The ones who most worshipped "Moon Jae-ang" ("Moon the disaster") were the trashy Gen-X forty-somethings
---------------------------------------------------------------------------------------------------------
To produce the same amount of energy, solar power requires 450 times more land than nuclear power, and wind power requires 700 times more land than a natural-gas well.
----------------------------------------------------------------------------------------------------
A work made from trash
Emily, February 1, 2018