Popular public knowledge bases such as DBpedia, YAGO, and Freebase contain information about named events such as conferences, concerts, earthquakes, and summits. However, these knowledge bases are not aware of fine-grained events. For example, the YAGO KB does not contain any information about the event "Champions League 2012/2013 final". The main reason is that KBs like YAGO tap only into Wikipedia or similarly curated resources to construct knowledge about named events, and these resources do not include very specific, fine-grained events as separate entities, even though such events are covered extensively in news sources. Our goal in this work is therefore to extract named events from news articles, reconcile them into canonicalized events, and organize them into semantic classes to populate a knowledge base. We call this system EVIN, which stands for "EVents In News".

EVIN infers the semantic classes of events via statistical language models. Besides semantic classes, EVIN also exploits different kinds of similarity measures among news articles, referring to textual contents, entity occurrences, and temporal ordering. These similarities are captured in a multi-view attributed graph. To distill canonicalized events, EVIN coarsens the graph by iterative merging based on a judiciously designed loss function. The graph coarsening algorithm in the EVIN framework performs three important tasks at once: 1) Grouping of news articles into named events, so that articles reporting the same event are merged into one coarse node. 2) Temporal chaining of the events in the coarse graph, so that EVIN can determine whether related events succeed or precede each other. 3) Induction of the entity and semantic-class sets of the events, so that EVIN can identify the most important entities and classes of an event from the news articles composing it. Finally, the generated events are put into the KB. Graph coarsening, in a nutshell, iteratively merges nodes into coarser nodes until a stopping criterion is met, such that the resulting graph largely preserves the structural properties of the original graph; a minimal sketch of this loop is given below.
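To make the coarsening loop concrete, here is a minimal sketch, assuming a NetworkX-style graph whose nodes are news articles and whose "weight" edge attribute aggregates the textual, entity, and temporal similarities; the merge loss and the stopping threshold are illustrative placeholders, not EVIN's actual formulation.

```python
# Minimal sketch of iterative graph coarsening (illustrative, not EVIN's
# actual algorithm). Nodes are news articles; the "weight" edge attribute
# is assumed to aggregate textual, entity, and temporal similarity.
import networkx as nx

def merge_loss(g, u, v):
    # Placeholder loss: merging a strongly similar pair costs little.
    return 1.0 - g[u][v].get("weight", 0.0)

def coarsen(g, max_loss=0.5):
    g = g.copy()
    while g.number_of_edges() > 0:
        # Greedily pick the cheapest merge.
        loss, u, v = min(((merge_loss(g, u, v), u, v) for u, v in g.edges()),
                         key=lambda c: c[0])
        if loss > max_loss:   # stopping criterion
            break
        # Contract v into u; a full implementation would also recombine the
        # similarity weights of the merged neighbourhoods.
        g = nx.contracted_nodes(g, u, v, self_loops=False)
    return g
```

Each coarse node of the returned graph then stands for one candidate named event, grouping the news articles that were merged into it.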
Let's see how the EVIN prototype works in a few scenarios. The EVIN web user interface provides a traditional yet easy way of interacting with the system: querying :-) The user can type a keyword query to retrieve the events relevant to it. Let's type "German teams" to see the events related to our query. These results show the events related to the query, and the user can click on them to see further information about each one.

Before that, I would like to show how we can explore the events via semantic classes. Users can choose a semantic class, let's say "cup finals". The following events are all the sport finals currently extracted. We used a relatively small corpus for the purposes of this screencast; in the full-fledged EVIN system, many more results would be retrieved. Let's click event 101. The temporal chaining of the events is shown in this chaining section, and the grouping is shown here; we can see more information about the events by clicking on them. The related images appear here. These images are found by querying a search engine with the most important entities of the events, and their relatedness to the events demonstrates the effectiveness of the algorithms in the EVIN system.

Let's look at one more example case for EVIN. Besides merely querying the system or exploring the events of a semantic class, users can also combine both: so-called combined search and discovery. So, let's combine our first two cases into one. We type the query "German teams" and choose the class "cup final". These results are the only events in which German teams played a cup final. Let's click event 102 to discover more. Here is the famous cup final between Bayern Munich and Borussia Dortmund. The semantic classes of event 104 are tournament, sport, final, and championship. You can also see the images related to the event. It is also shown that event 102 is a sub-event of event 104. By clicking on the news items, users can see the image, the publication date, and the link to the full article. Thank You!
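As a small illustration of the combined search and discovery shown above, the sketch below keeps only the events that match both a keyword query and a chosen semantic class. The event fields (id, label, entities, classes) and the naive substring matching are assumptions made for this example, not EVIN's actual data model or ranking.

```python
# Hypothetical sketch of combined search and discovery: filter extracted
# events by a keyword query AND a chosen semantic class.
def combined_search(events, query, semantic_class):
    terms = query.lower().split()
    results = []
    for event in events:
        text = " ".join([event["label"]] + event["entities"]).lower()
        if semantic_class in event["classes"] and any(t in text for t in terms):
            results.append(event)
    return results

events = [
    {"id": 102, "label": "Champions League 2012/2013 final",
     "entities": ["Bayern Munich", "Borussia Dortmund"],
     "classes": ["cup final", "tournament"]},
]
print(combined_search(events, "Bayern", "cup final"))   # -> the event above
```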