"Evolution of Software and Computing for Experiments"

High-energy physics experiments produce huge amounts of data that must be processed and stored for further analysis, and in some cases treated in real time for triggering and monitoring purposes. Experimental groups at the LHC have reviewed their Run 1 experience in detail, adopted the latest computing and software technologies, and constructed new computing models in preparation for Run 2. On the intensity frontier, SuperKEKB will start commissioning in 2015, and fixed-target experiments at CERN, Fermilab and J-PARC are growing in size. In the nuclear physics field, FAIR is under construction, and RHIC is well into its Phase-II research program, facing larger datasets and the new challenges of precision physics. Looking further ahead, developments are progressing towards the construction of the ILC. Beyond these examples, non-accelerator experiments are also seeking novel computing models as their apparatus and operations become larger and more distributed. In all of these projects, computing and software will be even more important than before. With the progress, ideas and new directions of recent years, we strongly feel this is the right time to meet and share our experiences.