Cluster-Benchmarking
Cluster benchmarking of CPU, RAM, I/O, and network using C.

Automatic-smear-detection-on-the-camera-lens-from-given-street-view-images
Detects smear present on the camera lens through various image processing algorithms.

translates-the-virtual-address-into-Physical-address
uv2p() – a system call that translates a virtual address into a physical address.

Computer_Vision_projects
All the computer vision (CV) code.

Convolution-network-for-Bush-and-William-Data-Set
Convolutional network for the Bush and William data set, identifying whether they appear in a picture.

Heartbeat_data_analysis
Heartbeat data analysis after reading ECG data.

Deep-neural-network-with-limiter-precision-on-GPUs-using-python
Deep neural network with limited precision on GPUs, using Python.

Coupon-Inventory-System
Coupon inventory system written in Java.

Sort-on-Single-shared-memory
External sorting and multi-threaded programming on single-node shared memory.

implement-semi-colon-and-and-operator-in-XV6
Implements the semicolon and AND operators in xv6.

Deep-learning-with-limited-precision-using-GENANN
Deep learning with limited precision using GENANN in C, covering precisions such as int8, int16, float32, and float16.

Generl-Sorting-for-Teragigabit-of-Data
External data sorting and multi-threaded programming performed on single-node shared memory for two given datasets of 2 GB and 20 GB. The purpose of the project is to sort datasets larger than memory using the external sorting technique. The results are evaluated and compared against those of the Linux sort implementation. The document graphically illustrates the sort-then-merge strategy and presents a tabular performance evaluation of the same.

Sort-using-Spark-and-Hadoop
This project covers sorting with Hadoop and Spark on multiple nodes. We used a Linux system to implement our application on the Proton cluster accessible at 216.47.142.37; each Hadoop group (cluster) has 4 nodes, each node having 4 cores, 8 GB of memory, and 80 GB of SSD storage.
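The sort-then-merge strategy described for Generl-Sorting-for-Teragigabit-of-Data can be sketched as follows. This is a minimal illustration, not the project's code: split the input into memory-sized runs, sort each run to a temporary file, then k-way merge the sorted runs. The function names and the `run_size` parameter are assumptions for the sketch.

```python
import heapq
import os
import tempfile


def _flush_run(run):
    # Sort one in-memory run and spill it to a temporary file.
    run.sort()
    fd, path = tempfile.mkstemp(text=True)
    with os.fdopen(fd, "w") as f:
        f.writelines(line + "\n" for line in run)
    return path


def external_sort(lines, run_size=3):
    """Sort an iterable of lines while holding at most run_size items in memory."""
    run_files, run = [], []
    for line in lines:
        run.append(line)
        if len(run) >= run_size:
            run_files.append(_flush_run(run))
            run = []
    if run:
        run_files.append(_flush_run(run))
    # k-way merge of the sorted runs; heapq.merge streams them lazily,
    # so memory use stays bounded by the number of runs, not the data size.
    handles = [open(p) for p in run_files]
    try:
        merged = [l.rstrip("\n") for l in heapq.merge(*handles)]
    finally:
        for h in handles:
            h.close()
        for p in run_files:
            os.unlink(p)
    return merged
```

In the project the runs would be sized to available RAM (far larger than three records); the structure — sorted runs on disk, then a streaming merge — is the same.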
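As an illustration of the kind of limited-precision arithmetic the Deep-learning-with-limited-precision-using-GENANN project compares (int8 versus float32, etc.), the sketch below quantizes float weights to int8 with a symmetric per-tensor scale and dequantizes them back. This is an assumed, generic scheme, not GENANN's actual implementation.

```python
def quantize_int8(weights):
    """Map floats to int8 range [-127, 127] using a symmetric per-tensor scale."""
    # Scale so the largest-magnitude weight maps to +/-127; fall back to 1.0
    # for an all-zero tensor to avoid dividing by zero.
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    quantized = [max(-127, min(127, round(w / scale))) for w in weights]
    return quantized, scale


def dequantize(quantized, scale):
    """Recover approximate float weights from the int8 representation."""
    return [q * scale for q in quantized]
```

The round trip loses precision in proportion to the scale, which is the trade-off such projects measure against full float32 training.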