A novel omnidirectional visual sensor, the Panoptic camera, is introduced. The Panoptic camera is an omnidirectional multi-aperture visual system realized by mounting multiple camera sensors on a hemispherical frame. The sensor is compared against existing state-of-the-art omnidirectional solutions. A novel scheme for positioning the camera sensors over a spherical geometry is introduced; it is derived by populating the sphere surface with circular faces of constant area. A geometrical approach is presented for analyzing some of the fundamental limits of this visual sensor, and it is shown how the Voronoi tessellation can be utilized to extract the full-view and depth-map coverage distances of the Panoptic camera. The omnidirectional vision reconstruction algorithm for this sensor, based on light-field interpolation, is explained, and the nearest-neighbor and linear interpolation techniques for the omnivision reconstruction algorithm are discussed. A centralized hardware approach is demonstrated for the real-time implementation of the omnidirectional vision reconstruction algorithm of the Panoptic camera; in this approach, all the camera video streams are gathered and processed by a single unit. The performance and resource requirements of this approach are quantified for the omnivision application, and implementation results on a custom-made FPGA platform are shown for two developed Panoptic prototype systems. A distributed and parallel implementation of the omnidirectional vision reconstruction algorithm of the Panoptic system is then presented, together with its hardware mapping onto camera modules featuring processing, memory and interconnectivity. The resource and performance requirements of such a camera module, and of the total system, are derived in quantifiable terms for the omnivision application.
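The nearest-neighbor reconstruction described above can be sketched in a few lines: each viewing direction of the virtual omnidirectional image is assigned to the contributing camera whose optical axis is closest in angle, and the pixel is sampled from that camera. The hemispherical axis layout, grid resolution and helper names below are illustrative assumptions, not the thesis implementation.

```python
import numpy as np

def sph_to_vec(theta, phi):
    """Unit vector from spherical angles (theta: polar, phi: azimuth)."""
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def nearest_camera(direction, camera_axes):
    """Index of the camera whose optical axis best matches `direction`."""
    dots = camera_axes @ direction  # cosine of angular distance
    return int(np.argmax(dots))

# Hypothetical hemispherical arrangement: one pole camera plus two rings.
camera_axes = np.vstack([
    sph_to_vec(0.0, 0.0)[None, :],
    [sph_to_vec(t, p)
     for t in (np.pi / 4, np.pi / 2)
     for p in np.linspace(0, 2 * np.pi, 8, endpoint=False)],
])

# Assign every direction of a small omnidirectional grid to a camera.
H, W = 64, 128
assignment = np.empty((H, W), dtype=int)
for i, theta in enumerate(np.linspace(0, np.pi / 2, H)):
    for j, phi in enumerate(np.linspace(0, 2 * np.pi, W, endpoint=False)):
        assignment[i, j] = nearest_camera(sph_to_vec(theta, phi), camera_axes)
```

Linear interpolation would instead blend the pixels of the few closest cameras with weights decreasing in angular distance, trading the blocky camera boundaries of the nearest-neighbor scheme for smoother transitions.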
The concept of an interconnected network of cameras is introduced. This model comprises cameras, a central unit and an interconnection network. Each camera is assumed to have local processing, memory and router interconnectivity in addition to its imaging capability; the cameras interact with each other and with the central unit, via their router connection ports, over the interconnection network that they form. Interconnection network theory is succinctly reviewed, and a methodology is introduced for arranging camera modules with interconnectivity features into a target interconnection network topology. The camera arrangement within an interconnected network is cast as a classical facility allocation problem: the quadratic assignment problem is utilized to assign the cameras of a Panoptic system to a target topology, with the Voronoi diagram of the Panoptic system providing the planar graph used in the assignment. The vertex p-center problem is used to select a few service points within the interconnection network that have access to the central unit. A validated cycle-accurate simulation tool is introduced for analyzing the performance of the interconnected network of cameras in terms of throughput and latency. The traffic patterns generated by the distributed implementations of the algorithms are extracted and provided as inputs to the simulator, and latency-versus-throughput graphs are derived. Simulation results for interconnected networks of cameras with different router parameters are demonstrated, and a design guideline for implementing an interconnection network of cameras from the extracted simulation results is illustrated in a case study. A unique custom-made FPGA hardware platform is introduced for emulating the interconnected network of cameras of a prototype Panoptic system, and real-time implementation results of the omnivision application on this FPGA platform are demonstrated.
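The quadratic assignment formulation above can be illustrated on a toy instance. The flow matrix (communication demand between cameras, here taken from an assumed Voronoi neighborhood graph) and the distance matrix (hop counts between slots of an assumed 4-node ring topology) are hypothetical data, and the exhaustive search stands in for the heuristics a real-sized Panoptic system would require.

```python
import itertools
import numpy as np

# flow[i][j]: assumed communication demand between cameras i and j
# (adjacency in a hypothetical Voronoi neighborhood graph forming a cycle).
flow = np.array([[0, 1, 0, 1],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [1, 0, 1, 0]])

# dist[a][b]: hop distance between slots a and b of a 4-node ring topology.
dist = np.array([[0, 1, 2, 1],
                 [1, 0, 1, 2],
                 [2, 1, 0, 1],
                 [1, 2, 1, 0]])

def qap_cost(perm):
    """Total communication cost when camera i occupies topology slot perm[i]."""
    p = np.asarray(perm)
    return int((flow * dist[np.ix_(p, p)]).sum())

# Exhaustive search is feasible only for tiny instances; larger systems
# would use a QAP heuristic such as simulated annealing or tabu search.
best = min(itertools.permutations(range(4)), key=qap_cost)
```

Here the optimum places neighboring cameras on adjacent ring slots, so every unit of flow travels a single hop, which is exactly the intuition behind matching the Voronoi adjacency of the camera arrangement to the network topology.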
Conclusions are presented and future work is discussed.