Sunday, May 10, 2020

Spatial computing

Abstract on Spatial computing



Spatial computing is human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces. It is an essential component for making our machines fuller partners in our work and play. This thesis presents a series of experiments in the discipline and an analysis of its fundamental properties.

Spatial Computing is the practice of using physical space to send input to and receive output from a computer.


What is spatial computing?


Spatial computing is digital technology that interacts with us in the places we live, work and play. A spatial computer, like Magic Leap 1, knows where it is in space. It uses a variety of sensors and cameras to build an understanding of both its environment and its user. This enables immersive, mixed-reality experiences that seamlessly blend the digital and the real world.

Spatial Computing Devices


VR Headsets

Hardware development in the last few years has accelerated to the point where we have a large spectrum of wearable Virtual Reality products. These range from the high-end HTC Vive, Oculus Rift, and Samsung Gear VR all the way to economical smartphone-powered headsets available at most major retailers.

AR Glasses

Glasses and goggles equipped to project data and two-dimensional imagery are more of a specialty product than VR headsets. They're generally designed for workplace and industrial applications. Products like the Google Glass and Vuzix Blade are good examples, but with Microsoft's HoloLens and Magic Leap's Lightwear poised to reach mainstream consumers, we will soon see AR wearables gaining ground on their larger, heavier VR counterparts.

Hybrid Gear

A hybrid wearable is one that shifts effectively between VR and AR – ideally encompassing MR as well. A truly versatile holistic pair of XR glasses is still in the R&D phase with several tech giants. We expect to see huge players like Apple, Samsung, Google, and Microsoft acquiring start-ups, cherry-picking the best cutting edge XR hardware and software until they can produce affordable consumer-ready glasses that realize the hybrid dream. It’s not decades away, but it’s not here yet.




https://www.stambol.com/2019/03/12/spatial-computing-in-less-than-140-characters-and-more/
https://medium.com/@victoragulhon/what-is-spatial-computing-777fae84a499
https://developer.magicleap.com/en-us/learn/guides/design-spatial-computing

Latest seminar topics on Computer Security for Computer Science

Computer security, cybersecurity or information technology security (IT security) is the protection of computer systems and networks from the theft of or damage to their hardware, software, or electronic data, as well as from the disruption or misdirection of the services they provide. The field is becoming more important due to increased reliance on computer systems, the Internet and wireless network standards such as Bluetooth and Wi-Fi, and due to the growth of "smart" devices, including smartphones, televisions, and the various devices that constitute the "Internet of things". Owing to its complexity, both in terms of politics and technology, cybersecurity is also one of the major challenges in the contemporary world.

Internet security
Automotive security
Cybersex trafficking
Cyberwarfare
Computer security
Mobile security
Network security

Advanced persistent threat
Computer crime
Vulnerabilities
Eavesdropping
Malware
Spyware
Ransomware
Trojans
Viruses
Worms
Rootkits
Bootkits
Keyloggers
Screen scrapers
Exploits
Backdoors
Logic bombs
Payloads
Denial of service
Web shells
Web application security
Phishing

Computer access control
Application security
Antivirus software
Secure coding
Secure by default
Secure by design
Secure operating systems
Authentication
Multi-factor authentication
Authorization
Data-centric security
Encryption
Firewall
Intrusion detection system
Mobile secure gateway
Runtime application self-protection (RASP)

Latest seminar topics on computer science: Edge detection, Virtual private cloud, Data science, Machine learning

Edge detection


Edge detection includes a variety of mathematical methods that aim at identifying points in a digital image at which the image brightness changes sharply or, more formally, has discontinuities. The points at which image brightness changes sharply are typically organized into a set of curved line segments termed edges. The same problem of finding discontinuities in one-dimensional signals is known as step detection and the problem of finding signal discontinuities over time is known as change detection. Edge detection is a fundamental tool in image processing, machine vision and computer vision, particularly in the areas of feature detection and feature extraction.
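
Below is a minimal edge-detection sketch in Python using Sobel gradients from SciPy; the synthetic test image and the 0.2 threshold are illustrative assumptions, not any particular method described above.

import numpy as np
from scipy import ndimage

def sobel_edges(image, threshold=0.2):
    """Return a boolean mask marking pixels where brightness changes sharply."""
    gx = ndimage.sobel(image, axis=1)    # horizontal brightness gradient
    gy = ndimage.sobel(image, axis=0)    # vertical brightness gradient
    magnitude = np.hypot(gx, gy)         # gradient magnitude at each pixel
    magnitude /= magnitude.max() or 1.0  # normalise to [0, 1]
    return magnitude > threshold         # strong gradients become candidate edges

# Example: a synthetic image with a bright square on a dark background.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
edges = sobel_edges(img)
print(edges.sum(), "edge pixels found")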


Virtual private cloud


A virtual private cloud (VPC) is an on-demand configurable pool of shared computing resources allocated within a public cloud environment, providing a certain level of isolation between the different organizations (denoted as users hereafter) using the resources. The isolation between one VPC user and all other users of the same cloud (other VPC users as well as other public cloud users) is achieved normally through allocation of a private IP subnet and a virtual communication construct (such as a VLAN or a set of encrypted communication channels) per user. In a VPC, the previously described mechanism, providing isolation within the cloud, is accompanied by a VPN function (again, allocated per VPC user) that secures, by means of authentication and encryption, the remote access of the organization to its VPC resources. With the introduction of the described isolation levels, an organization using this service is in effect working on a 'virtually private' cloud (that is, as if the cloud infrastructure is not shared with other users), and hence the name VPC. VPC is most commonly used in the context of cloud infrastructure as a service. In this context, the infrastructure provider, providing the underlying public cloud infrastructure, and the provider realizing the VPC service over this infrastructure, may be different vendors.
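
As a small, hedged illustration of the isolation mechanism described above, here is a minimal Python sketch assuming AWS and its boto3 SDK (the CIDR blocks are placeholders); other cloud providers expose analogous APIs.

import boto3

ec2 = boto3.client("ec2")  # credentials and region are taken from the environment

# Allocate a private address space for this organisation's VPC...
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# ...and carve a private IP subnet out of it.
subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
print(vpc_id, subnet["Subnet"]["SubnetId"])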


Data science


Data science is an inter-disciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from structured and unstructured data. Data science is related to data mining, deep learning and big data. Data science is a "concept to unify statistics, data analysis, machine learning and their related methods" in order to "understand and analyze actual phenomena" with data. It uses techniques and theories drawn from many fields within the context of mathematics, statistics, computer science, and information science. Turing Award winner Jim Gray imagined data science as a "fourth paradigm" of science (empirical, theoretical, computational and now data-driven) and asserted that "everything about science is changing because of the impact of information technology" and the data deluge.

Machine learning


Machine learning (ML) is the study of computer algorithms that improve automatically through experience. It is seen as a subset of artificial intelligence. Machine learning algorithms build a mathematical model based on sample data, known as "training data", in order to make predictions or decisions without being explicitly programmed to do so. Machine learning algorithms are used in a wide variety of applications, such as email filtering and computer vision, where it is difficult or infeasible to develop conventional algorithms to perform the needed tasks.
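
A minimal supervised-learning sketch with scikit-learn is shown below; the tiny "training data" is invented purely for illustration. The point is that the model's behaviour is learned from examples rather than explicitly programmed.

from sklearn.linear_model import LogisticRegression

# Toy training data: (hours studied, hours slept) -> passed the exam (1) or not (0).
X_train = [[1, 4], [2, 5], [6, 8], [7, 7], [8, 6], [3, 3]]
y_train = [0, 0, 1, 1, 1, 0]

model = LogisticRegression()
model.fit(X_train, y_train)             # build a mathematical model from sample data

print(model.predict([[5, 7], [1, 2]]))  # predictions for previously unseen examples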


VLAN hopping


VLAN hopping is a computer security exploit, a method of attacking networked resources on a virtual LAN (VLAN). The basic concept behind all VLAN hopping attacks is for an attacking host on a VLAN to gain access to traffic on other VLANs that would normally not be accessible. There are two primary methods of VLAN hopping: switch spoofing and double tagging. Both attack vectors can be mitigated with proper switch port configuration.

Data mining 


Data mining is the process of discovering patterns in large data sets involving methods at the intersection of machine learning, statistics, and database systems. Data mining is an interdisciplinary subfield of computer science and statistics with an overall goal to extract information (with intelligent methods) from a data set and transform the information into a comprehensible structure for further use. Data mining is the analysis step of the "knowledge discovery in databases" process, or KDD. Aside from the raw analysis step, it also involves database and data management aspects, data pre-processing, model and inference considerations, interestingness metrics, complexity considerations, post-processing of discovered structures, visualization, and online updating.
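
As one small, hedged example of the "intelligent methods" used in the analysis step, the sketch below clusters a toy customer table with scikit-learn's KMeans; the data and the choice of two clusters are purely illustrative.

import numpy as np
from sklearn.cluster import KMeans

# Each row describes one customer: (items per order, average order value).
customers = np.array([[2, 15], [3, 18], [2, 20],        # small, low-value orders
                      [12, 110], [15, 95], [14, 120]])  # large, high-value orders

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(kmeans.labels_)           # discovered grouping of the customers
print(kmeans.cluster_centers_)  # a simple "pattern" summarising each group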

Virtual LAN


A virtual LAN (VLAN) is any broadcast domain that is partitioned and isolated in a computer network at the data link layer (OSI layer 2). LAN is the abbreviation for local area network and in this context virtual refers to a physical object recreated and altered by additional logic. VLANs work by applying tags to network frames and handling these tags in networking systems – creating the appearance and functionality of network traffic that is physically on a single network but acts as if it is split between separate networks. In this way, VLANs can keep network applications separate despite being connected to the same physical network, and without requiring multiple sets of cabling and networking devices to be deployed.
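
The sketch below uses the Scapy library in Python to show how an 802.1Q tag is applied to an Ethernet frame; the addresses, VLAN ID and interface name are illustrative assumptions.

from scapy.all import Ether, Dot1Q, IP, ICMP, sendp

frame = (Ether(dst="ff:ff:ff:ff:ff:ff") /
         Dot1Q(vlan=10) /                  # 802.1Q tag: this frame belongs to VLAN 10
         IP(dst="192.168.10.255") /
         ICMP())

frame.show()                               # inspect the tagged frame
# sendp(frame, iface="eth0")               # would transmit it on a trunk port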

Sass (stylesheet language)


Sass (short for syntactically awesome style sheets) is a style sheet language initially designed by Hampton Catlin and developed by Natalie Weizenbaum. After its initial versions, Weizenbaum and Chris Eppstein have continued to extend Sass with SassScript, a scripting language used in Sass files. Sass is a preprocessor scripting language that is interpreted or compiled into Cascading Style Sheets (CSS). SassScript is the scripting language itself. Sass consists of two syntaxes. The original syntax, called "the indented syntax," uses a syntax similar to Haml. It uses indentation to separate code blocks and newline characters to separate rules. The newer syntax, "SCSS" (Sassy CSS), uses block formatting like that of CSS. It uses braces to denote code blocks and semicolons to separate rules within a block. The indented syntax and SCSS files are traditionally given the extensions .sass and .scss, respectively.

Virtual private network


A virtual private network (VPN) extends a private network across a public network and enables users to send and receive data across shared or public networks as if their computing devices were directly connected to the private network. Applications running on an end system (PC, smartphone etc.) across a VPN may therefore benefit from the functionality, security, and management of the private network. Encryption is a common, though not an inherent, part of a VPN connection.


ERP (Enterprise Resource Planning)


Enterprise resource planning (ERP) is business process management software that allows an organization to use a system of integrated applications to manage the business and automate many back office functions related to technology, services and human resources. ERP software integrates all facets of an operation, including product planning, development, manufacturing, sales and marketing. An important goal of ERP is to facilitate the flow of information so business decisions can be data-driven. ERP software suites are built to collect and organize data from various levels of an organization to provide management with insight into key performance indicators (KPIs) in real time.

Claytronics


Claytronics is a system designed to implement the concept of programmable matter, that is, material which can be manipulated electronically in three dimensions in the same way that two-dimensional images can be manipulated through computer graphics. Such materials would be composed of “catoms” — claytronics atoms — which would, in analogy with actual atoms, be the smallest indivisible units of the programmable matter. As of 2006, researchers had already created a prototype catom that is 44 millimeters in diameter. The goal is to eventually produce catoms that are one or two millimeters in diameter, small enough to produce convincing replicas.





Google Cloud Computing (GCP)

Abstract on Google cloud computing:

Google Cloud Platform (GCP) is Google's public cloud offering, comparable to Amazon Web Services and Microsoft Azure. The difference is that GCP is built upon Google's massive, cutting-edge infrastructure that handles the traffic and workload of all Google users. There is a wide range of services available in GCP, ranging from Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) to completely managed Software-as-a-Service (SaaS). We will discuss the available infrastructure components and how they provide a powerful and flexible foundation on which to build your applications.


Google Cloud Platform (GCP) is one of the leaders among cloud APIs. Although it was established only five years ago, GCP has expanded notably thanks to the suite of public cloud services it has built on a huge, solid infrastructure. GCP allows developers to use these services by accessing the GCP RESTful API, which is described through HTML pages on its website. However, the documentation of the GCP API is written in natural language (English prose) and therefore exhibits several drawbacks, such as Informal Heterogeneous Documentation, Imprecise Types, Implicit Attribute Metadata, Hidden Links, Redundancy and Lack of Visual Support.

What is Cloud Computing:


Cloud computing is a highly scalable and cost-effective infrastructure for running HPC, enterprise and Web applications. However, the growing demand for Cloud infrastructure has drastically increased the energy consumption of data centers, which has become a critical issue. High energy consumption not only translates to high operational cost, which reduces the profit margin of Cloud providers, but also leads to high carbon emissions, which is not environmentally friendly. Hence, energy-efficient solutions are required to minimize the impact of Cloud computing on the environment. In order to design such solutions, a deep analysis of Cloud infrastructure is required with respect to its power efficiency. For more, see the separate seminar topic on cloud computing.

What is Google cloud computing:

Google Cloud Platform (GCP), offered by Google, is a suite of cloud computing services that runs on the same infrastructure that Google uses internally for its end-user products, such as Google Search, Gmail and YouTube. Alongside a set of management tools, it provides a series of modular cloud services including computing, data storage, data analytics and machine learning. Registration requires a credit card or bank account details.

Google Cloud Platform provides infrastructure as a service, platform as a service, and serverless computing environments.
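
As a small, hedged illustration of calling one of these managed services from application code, the sketch below uses the google-cloud-storage Python client library; the bucket and object names are placeholders and assume a project with credentials already configured.

from google.cloud import storage

client = storage.Client()                    # uses your GCP project credentials
bucket = client.bucket("my-example-bucket")  # an existing Cloud Storage bucket

blob = bucket.blob("reports/hello.txt")
blob.upload_from_string("Hello from GCP")    # write an object
print(blob.download_as_text())               # read it back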


Google Cloud reference architecture

https://cloud.google.com/migrate/compute-engine/docs/4.5/concepts/architecture/gcp-reference-architecture


What are Google Cloud Platform (GCP) Services?


Google offers a wide range of services. The following are the major Google Cloud services:

  • Compute
  • Networking
  • Storage and Databases
  • Big Data
  • Machine Learning
  • Identity & Security
  • Management and Developer Tools
  • Cloud AI
  • IoT
  • API Platform


Reference Site:
https://en.wikipedia.org/wiki/Google_Cloud_Platform
https://www.edureka.co/blog/what-is-google-cloud-platform/

Sunday, May 3, 2020

Web Scraping

Since the evolution of the WWW, the landscape of internet use and data exchange has changed rapidly. As ordinary people joined the internet and began to use it, many new techniques were introduced to boost the network. At the same time, new technologies were introduced to improve computers and network facilities, which automatically drove down the cost of hardware and of running websites. Because of all these changes, a large number of users have joined and now use internet facilities, and everyday use of the internet has produced a tremendous amount of data available online. Businesses, academics and researchers all share their advertisements and information on the internet so that they can reach people quickly and easily.

As a result of exchanging, sharing and storing data on the internet, a new problem has arisen: how to handle this data overload, and how a user can find the best information with the least effort. To solve this problem, researchers developed a technique called web scraping. Web scraping is an important technique used to generate structured data from the unstructured data available on the web.
Image source: https://www.edureka.co/blog/web-scraping-with-python/


The structured data generated by scraping is then stored in a central database and analysed in spreadsheets. Traditional copy-and-paste, text grepping and regular expression matching, HTTP programming, HTML parsing, DOM parsing, web scraping software, vertical aggregation platforms, semantic annotation recognition and computer-vision web-page analysers are some of the common techniques used for data scraping. Previously, most users relied on the common copy-and-paste technique for gathering and analysing data on the internet, but it is a tedious approach in which large amounts of data are copied by the user and stored in computer files.

Compared to this, web scraping software is the easiest scraping technique. Nowadays, a lot of software for web scraping is available in the market. This paper focuses on an overview of the information extraction technique known as web scraping, the different techniques of web scraping and some of the recent tools used for web scraping.

Keywords: web mining, information extraction, web scraping


What is web scraping?


If you've ever copied and pasted information from a website, you've performed the same function as any web scraper, only on a microscopic, manual scale.

Web scraping, also known as web data extraction, is the process of retrieving or “scraping” data from a website. Unlike the mundane, mind-numbing process of manually extracting data, web scraping uses intelligent automation to retrieve hundreds, millions, or even billions of data points from the internet’s seemingly endless frontier.

More than a modern convenience, the true power of web scraping lies in its ability to build and power some of the world’s most revolutionary business applications. ‘Transformative’ doesn’t even begin to describe the way some companies use web scraped data to enhance their operations, informing executive decisions all the way down to individual customer service experiences. 




Why Web Scraping?

Web scraping is used to collect large amounts of information from websites. But why does someone have to collect such large amounts of data from websites? To find out, let's look at the applications of web scraping:

Price Comparison: Services such as ParseHub use web scraping to collect data from online shopping websites and use it to compare the prices of products.

Email address gathering: Many companies that use email as a medium for marketing use web scraping to collect email IDs and then send bulk emails.

Social Media Scraping: Web scraping is used to collect data from Social Media websites such as Twitter to find out what’s trending.

Research and Development: Web scraping is used to collect a large set of data (Statistics, General Information, Temperature, etc.) from websites, which are analyzed and used to carry out Surveys or for R&D.

Job listings: Details regarding job openings and interviews are collected from different websites and then listed in one place so that they are easily accessible to the user.


How does Web Scraping work?

When you run the code for web scraping, a request is sent to the URL that you have mentioned. As a response to the request, the server sends the data and allows you to read the HTML or XML page. The code then parses the HTML or XML page, finds the data and extracts it.

To extract data using web scraping with Python, you need to follow these basic steps (a minimal code sketch follows the list):

  • Find the URL that you want to scrape
  • Inspect the page
  • Find the data you want to extract
  • Write the code
  • Run the code and extract the data
  • Store the data in the required format 
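
Here is a minimal sketch of those steps using the requests and BeautifulSoup libraries; the URL, CSS selectors and output file name are placeholders to adapt to the page you actually want to scrape.

import csv
import requests
from bs4 import BeautifulSoup

url = "https://example.com/products"                    # 1. the URL you want to scrape
response = requests.get(url)                            # send the request
soup = BeautifulSoup(response.text, "html.parser")      # 2-3. inspect and parse the HTML page

rows = []
for item in soup.select("div.product"):                 # 4-5. find and extract the data
    name = item.select_one("h2").get_text(strip=True)
    price = item.select_one("span.price").get_text(strip=True)
    rows.append([name, price])

with open("products.csv", "w", newline="") as f:        # 6. store it in the required format
    writer = csv.writer(f)
    writer.writerow(["name", "price"])
    writer.writerows(rows)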

What are the different scraping techniques?


Manual scraping
  •     Copy-pasting

Automated Scraping
  •     HTML Parsing
  •     DOM Parsing
  •     Vertical Aggregation
  •     XPath (see the sketch after this list)
  •     Google Sheets
  •     Text Pattern Matching
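
For the XPath technique listed above, here is a minimal sketch using the lxml library; the URL and XPath expressions are illustrative placeholders.

import requests
from lxml import html

page = requests.get("https://example.com/jobs")
tree = html.fromstring(page.content)

titles = tree.xpath("//h2[@class='job-title']/text()")  # extract matching text nodes
links = tree.xpath("//a[@class='apply']/@href")         # extract attribute values

for title, link in zip(titles, links):
    print(title, "->", link)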


Sources / References:

http://www.ijfrcsce.org/download/browse/Volume_4/April_18_Volume_4_Issue_4/1524638955_25-04-2018.pdf

https://scrapinghub.com/what-is-web-scraping

https://www.edureka.co/blog/web-scraping-with-python/

Web Scraping PPT: https://www.slideshare.net/SelectoCompany/web-scraping-76200621