Yingyu Liang
Yingyu Liang, a new faculty member in machine learning in the University of Wisconsin-Madison Department of Computer Sciences, enjoys bringing his disparate interests together. In his free time, he enjoys hobbies ranging from badminton to classical Chinese poetry. Within computer science, he was drawn to machine learning because it pulls together so many things that fascinate him.
After finishing his Ph.D., he did a postdoc and then spent two years as a research scholar and lecturer. He arrived in Madison with his wife and toddler son in the fall, and he holds a Lubar Professorship.
As he explains, even though there are many successful applications of machine learning already, there is little understanding of the underlying theoretical principles. Liang is also eager to contribute to progress in natural language processing (NLP), the part of computer science that deals with the interaction between computers and human languages. For example, NLP comes into play when you interact with a personal digital assistant like Siri, Alexa or Cortana on a smartphone or an in-home device like the Amazon Echo.
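As a toy illustration of the kind of mapping such assistants perform (hypothetical code, not Liang's work; the pattern list and intent names are invented for this sketch), a rule-based system might turn a spoken request into a structured "intent" that software can act on:

```python
import re

# Invented patterns for this sketch: each maps a phrasing to an intent label.
INTENT_PATTERNS = [
    (re.compile(r"\b(weather|forecast)\b", re.IGNORECASE), "get_weather"),
    (re.compile(r"\bset (an? )?(alarm|timer)\b", re.IGNORECASE), "set_alarm"),
    (re.compile(r"\bplay\b.*\bmusic\b", re.IGNORECASE), "play_music"),
]

def parse_intent(utterance: str) -> str:
    """Return the first matching intent label, or 'unknown'."""
    for pattern, intent in INTENT_PATTERNS:
        if pattern.search(utterance):
            return intent
    return "unknown"

print(parse_intent("What's the weather like in Madison?"))  # get_weather
print(parse_intent("Please set a timer for ten minutes"))   # set_alarm
```

Real assistants replace the hand-written rules with learned models, which is where the theoretical questions Liang studies come in.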
He wants to provide theoretical models for NLP and also build practical systems. In the classroom, Liang strives to give undergraduates a solid foundation in computer science concepts, especially so they can stay abreast of the frontiers of research. For graduate students, he hopes to nurture individual research directions through recommended reading, potential project topics and more, guiding them as they find their footing as researchers in their own right.
Story courtesy of the Department of Computer Sciences.
What attracted you to UW—Madison?
I was attracted by the strong reputation of the Department of Computer Sciences and by the great research in my field, both within the department and across the campus. The Wisconsin Institute for Discovery also provides great opportunities to do interdisciplinary research.

What was your first visit to campus like?
I had great meetings with faculty members here, learned a lot about the department and the University, and enjoyed a tour around the campus.
Favorite place on campus?
The coffee shop in the Discovery Building. Great coffee and tea.

What are you most enjoying so far about working here?
The most enjoyable part is talking and collaborating with people from diverse backgrounds who have great ideas. Machine learning techniques have enabled us to build smarter and smarter devices, and they have found applications in many different scenarios.
Some of them are quite unexpected. For example, people are using machine learning methods for natural language to translate obscure legalese in contracts into plain language, and to help attorneys explore large volumes of documents for a case.
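A minimal sketch of the kind of building block behind such document-exploration tools (hypothetical code, not any specific product): ranking case documents against a query using bag-of-words cosine similarity, with invented example clauses.

```python
import math
from collections import Counter

def bag_of_words(text: str) -> Counter:
    """Represent a document as word counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)  # Counter returns 0 for missing words
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Invented contract clauses standing in for a large document collection.
docs = [
    "the lessee shall indemnify the lessor against all claims",
    "payment is due within thirty days of the invoice date",
    "either party may terminate this agreement with written notice",
]
query = "terminate the agreement"
ranked = sorted(docs,
                key=lambda d: cosine(bag_of_words(query), bag_of_words(d)),
                reverse=True)
print(ranked[0])  # the termination clause ranks highest
```

Production systems use learned representations rather than raw word counts, but the retrieve-and-rank structure is the same.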
Tags: computer science, faculty